Cohen's Kappa Calculator
Enter your confusion matrix values into the Cohen's Kappa Calculator: fill in the observed agreement cells for two raters across 2 to 5 categories, and get back Cohen's Kappa (κ), observed agreement, expected agreement, and an interpretation of the inter-rater reliability level. Choose the number of categories, enter the cell counts, and the kappa statistic is computed for you. Also try the Bootstrap Estimate calculator.
Results
Cohen's Kappa (κ)
--
Observed Agreement (Po)
--
Expected Agreement (Pe)
--
Total Observations (N)
--
Agreement Level
--
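The quantities reported above can be sketched in a few lines. This is a minimal illustration of the standard Cohen's kappa computation, assuming a square confusion matrix where cell [i][j] counts the items rater A placed in category i and rater B placed in category j; the function name `cohens_kappa` and the example counts are illustrative, not part of the calculator itself.

```python
def cohens_kappa(matrix):
    """Return (kappa, Po, Pe, N) for a square confusion matrix."""
    k = len(matrix)
    n = sum(sum(row) for row in matrix)          # Total Observations (N)
    po = sum(matrix[i][i] for i in range(k)) / n  # Observed Agreement (Po): diagonal share
    row_totals = [sum(row) for row in matrix]
    col_totals = [sum(matrix[i][j] for i in range(k)) for j in range(k)]
    # Expected Agreement (Pe): chance agreement from the marginal totals
    pe = sum(row_totals[i] * col_totals[i] for i in range(k)) / (n * n)
    kappa = (po - pe) / (1 - pe)
    return kappa, po, pe, n

# Example: two raters, two categories
m = [[20, 5],
     [10, 15]]
kappa, po, pe, n = cohens_kappa(m)
# kappa = 0.4, Po = 0.70, Pe = 0.50, N = 50
```

In this example the raters agree on 70% of the 50 items, but half of that agreement is expected by chance, giving κ = 0.4. The agreement level shown by the calculator maps κ onto a qualitative scale such as the commonly used Landis and Koch bands, under which 0.4 falls at the top of the "fair" range.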