chi2_kappa(M)

Computes the Measure of Agreement Kappa test using the data in matrix M. Matrix M must be a contingency table with 2 or more categories. The current implementation supports only 2 raters.
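
The statistic reported is Cohen's kappa, which compares the observed proportion of agreement, Po, with the proportion of agreement expected by chance from the marginal totals, Pe:

kappa = (Po - Pe) / (1 - Pe)

A kappa of 1 indicates perfect agreement; a kappa of 0 indicates agreement no better than chance.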


See also: chi2_goodness, chi2_independent, chi2_mcnemar, chi2_risk


Example


The contingency table below reports the agreements and disagreements of two judges:


                          First judge
Second judge      Very good   Just OK   Total
Very good                 4         1       5
Just OK                   2         5       7
Total                     6         6      12

Entering the basic data in a matrix, we obtain:

M = ( 4, 1; 2, 5)

/ 4 1 \
\ 2 5 /


We now use the function chi2_totable to transform the matrix into a proper contingency table:

M = chi2_totable(M)

/ ""  ""  ""  "" \
| ""   4   1   5 |
| ""   2   5   7 |
\ ""   6   6  12 /


Using the function chi2_kappa, we obtain:

chi2_kappa(M)


The hypothesis test assumes a one-tailed test for association.
For other tests, use the two-tailed probability.

Measure of Agreement Kappa.

Results:

Test statistics Kappa : 0.5

Asymp. Std. Error: 0.24650332429581734

Approx T: 1.7566201313073595

Approx Sig.: 0.07898257649095952
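
The figures above can be reproduced independently. Below is a minimal cross-check in Python, assuming the asymptotic standard error of Fleiss, Cohen and Everitt (1969) and, for the approximate T, the standard error under the null hypothesis kappa = 0 (a sketch of the standard formulas, not the function's actual implementation):

import math

# Observed counts: rows = second judge, columns = first judge
counts = [[4.0, 1.0],
          [2.0, 5.0]]
n = sum(sum(row) for row in counts)
p = [[c / n for c in row] for row in counts]                  # cell proportions
prow = [sum(row) for row in p]                                # row marginals
pcol = [sum(p[i][j] for i in range(2)) for j in range(2)]     # column marginals

po = sum(p[i][i] for i in range(2))                           # observed agreement: 0.75
pe = sum(prow[i] * pcol[i] for i in range(2))                 # chance agreement: 0.5
kappa = (po - pe) / (1 - pe)                                  # 0.5

# Asymptotic standard error (Fleiss, Cohen & Everitt, 1969)
t1 = sum(p[i][i] * ((1 - pe) - (pcol[i] + prow[i]) * (1 - po))**2 for i in range(2))
t2 = (1 - po)**2 * sum(p[i][j] * (pcol[i] + prow[j])**2
                       for i in range(2) for j in range(2) if i != j)
t3 = (po * pe - 2 * pe + po)**2
ase = math.sqrt((t1 + t2 - t3) / (n * (1 - pe)**4))           # 0.2465...

# Standard error under H0: kappa = 0, used for the approximate T
s = sum(prow[i] * pcol[i] * (prow[i] + pcol[i]) for i in range(2))
ase0 = math.sqrt((pe + pe**2 - s) / (n * (1 - pe)**2))
t_stat = kappa / ase0                                         # 1.7566...
sig = math.erfc(t_stat / math.sqrt(2))                        # two-tailed: 0.0790...

print(kappa, ase, t_stat, sig)

Running this sketch reproduces the four values reported above.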