Description

The kappa coefficient (or kappa statistic) is a measure of agreement between 2 observers interpreting the same data. It is the ratio of the actual agreement beyond chance to the potential agreement beyond chance. The number of possible interpretations per observation may be two or more. In this section the kappa statistic is calculated when there are 2 possible interpretations per observation, giving a 2x2 table.


                          Observer 2
Observer 1                present or positive    absent or negative    subtotal
present or positive       a                      b                     a + b
absent or negative        c                      d                     c + d
subtotal                  a + c                  b + d                 a + b + c + d

observed agreement as a proportion = (a + d) / (a + b + c + d)

 

expected agreement by chance as a proportion = (((a + b) * (a + c)) + ((c + d) * (b + d))) / ((a + b + c + d)^2)

 

observed agreement as a frequency (raw count) = (a + d)

 

expected agreement by chance as a frequency = (((a + b) * (a + c)) + ((c + d) * (b + d))) / (a + b + c + d)
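
As a minimal sketch in Python, the four agreement quantities above can be computed directly from the 2x2 table. The cell counts below are assumed example values for illustration only and are not from the original text.

# Assumed example cell counts for illustration only
a, b, c, d = 20, 5, 10, 15

n = a + b + c + d                  # total number of observations

observed_proportion = (a + d) / n  # observed agreement as a proportion
observed_frequency = a + d         # observed agreement as a raw count

# expected agreement by chance, as a proportion and as a raw count
expected_proportion = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2
expected_frequency = ((a + b) * (a + c) + (c + d) * (b + d)) / n

print(observed_proportion, observed_frequency)   # 0.7 35
print(expected_proportion, expected_frequency)   # 0.5 25.0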

 

The kappa coefficient can be calculated either from proportions of the total number or from the raw counts. The two forms are equivalent, since each frequency is simply the corresponding proportion multiplied by the total number (proportion = (number showing feature) / (total number)); the sketches below show that both give the same result.

 

kappa by proportion = ((observed agreement as a proportion) - (expected agreement by chance as a proportion)) / (1 - (expected agreement by chance as a proportion))
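
A short sketch of the proportion form, using the assumed example values from the previous snippet (observed agreement 0.70, expected agreement by chance 0.50):

def kappa_by_proportion(observed_proportion, expected_proportion):
    # kappa from observed and chance-expected agreement, both as proportions
    return (observed_proportion - expected_proportion) / (1 - expected_proportion)

print(kappa_by_proportion(0.70, 0.50))   # approximately 0.4 with the assumed example values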

 

kappa by frequency = ((observed agreement as a frequency) - (expected agreement by chance as a frequency)) / ((a + b + c + d) - (expected agreement by chance as a frequency))
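
The frequency form gives the same value, as this sketch with the same assumed counts (observed 35, expected 25, total 50) shows:

def kappa_by_frequency(observed_frequency, expected_frequency, total):
    # kappa from observed and chance-expected agreement as raw counts
    return (observed_frequency - expected_frequency) / (total - expected_frequency)

print(kappa_by_frequency(35, 25.0, 50))   # 0.4, identical to the proportion form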

 

standard deviation = SQRT(((observed agreement as a proportion) * (1 - (observed agreement as a proportion))) / ((total number) * ((1 - (expected agreement by chance as a proportion))^2)))
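
A sketch of the standard deviation calculation, again with the assumed example values (observed agreement 0.70, expected agreement by chance 0.50, total 50):

from math import sqrt

def kappa_standard_deviation(observed_proportion, expected_proportion, total):
    # approximate standard deviation (standard error) of the kappa coefficient
    return sqrt((observed_proportion * (1 - observed_proportion))
                / (total * (1 - expected_proportion) ** 2))

print(kappa_standard_deviation(0.70, 0.50, 50))   # about 0.13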

 

95% confidence interval = (calculated kappa) +/- (1.96 * (standard deviation))
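
And a sketch of the confidence interval, using the assumed kappa of 0.4 and standard deviation of about 0.13 from the previous snippets:

def kappa_confidence_interval(kappa, standard_deviation, z=1.96):
    # 95% confidence interval; z = 1.96 corresponds to 95% coverage
    return kappa - z * standard_deviation, kappa + z * standard_deviation

print(kappa_confidence_interval(0.4, 0.13))   # roughly (0.15, 0.65)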

 

Interpretation:

• minimum value for the kappa statistic: less than 0 (a negative value indicates agreement worse than chance)

• maximum value: 1

• The higher the value, the greater the level of agreement between the 2 observers.

 

Result for Kappa     Strength of Agreement
< 0.00               poor
0.00 – 0.20          slight
0.21 – 0.40          fair
0.41 – 0.60          moderate
0.61 – 0.80          substantial
0.81 – 1.00          almost perfect

from page 165, Landis and Koch (1977)
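
The Landis and Koch categories can also be applied programmatically. The sketch below simply encodes the table above, taking each cutoff as the upper end of its range:

def strength_of_agreement(kappa):
    # descriptive labels from Landis and Koch (1977)
    if kappa < 0.00:
        return "poor"
    elif kappa <= 0.20:
        return "slight"
    elif kappa <= 0.40:
        return "fair"
    elif kappa <= 0.60:
        return "moderate"
    elif kappa <= 0.80:
        return "substantial"
    else:
        return "almost perfect"

print(strength_of_agreement(0.4))   # "fair" for the assumed example kappa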

 

