Inter-Rater Reliability Calculator
Enter your confusion matrix data into the Inter-Rater Reliability Calculator to compute Cohen's Kappa, a widely used chance-corrected measure of agreement between two raters or observers. Select the number of categories (2–5), fill in the observed frequency cells of your contingency table, and the calculator returns the kappa coefficient, percent agreement, expected agreement by chance, and a strength-of-agreement interpretation. Categories are treated as nominal; ordered categories can be entered, but unweighted kappa counts all disagreements equally, so it gives no extra credit for near-agreement between adjacent categories.
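For reference, kappa compares the observed proportion of agreement p_o (the diagonal cells of the table divided by the total count) with the agreement expected by chance p_e (computed from the row and column marginals):

    kappa = (p_o - p_e) / (1 - p_e)

A kappa of 1 indicates perfect agreement, 0 indicates agreement no better than chance, and negative values indicate agreement worse than chance.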
Results
The calculator reports five values:
- Cohen's Kappa (κ)
- Percent Agreement
- Expected Agreement (by Chance)
- Total Observations
- Strength of Agreement
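The sketch below shows how these five values can be computed from a square table of observed frequencies. It is a minimal illustration in Python, not the calculator's own code: the function names are hypothetical, and the strength-of-agreement labels follow the commonly cited Landis and Koch (1977) bands, which the calculator may or may not use.

```python
def cohens_kappa(matrix):
    """Compute Cohen's kappa and related statistics from a k x k confusion
    matrix of observed frequencies (rater 1 = rows, rater 2 = columns)."""
    n = sum(sum(row) for row in matrix)                   # total observations
    k = len(matrix)
    p_o = sum(matrix[i][i] for i in range(k)) / n         # percent agreement
    row_totals = [sum(row) for row in matrix]
    col_totals = [sum(matrix[i][j] for i in range(k)) for j in range(k)]
    # Expected agreement by chance, from the marginal totals.
    p_e = sum(row_totals[i] * col_totals[i] for i in range(k)) / (n * n)
    kappa = (p_o - p_e) / (1 - p_e)                       # kappa = (p_o - p_e) / (1 - p_e)
    return kappa, p_o, p_e, n

def strength_of_agreement(kappa):
    """Map kappa to a verbal label (illustrative Landis-Koch bands)."""
    if kappa < 0:
        return "Poor"
    if kappa <= 0.20:
        return "Slight"
    if kappa <= 0.40:
        return "Fair"
    if kappa <= 0.60:
        return "Moderate"
    if kappa <= 0.80:
        return "Substantial"
    return "Almost perfect"

# Example: two raters classifying 100 items into 2 categories.
matrix = [[45, 5],
          [10, 40]]
kappa, p_o, p_e, n = cohens_kappa(matrix)
print(f"kappa = {kappa:.3f}, agreement = {p_o:.0%}, "
      f"chance = {p_e:.0%}, n = {n}, {strength_of_agreement(kappa)}")
```

On the 2x2 example above, this prints kappa = 0.700 with 85% observed agreement against 50% expected by chance, which the Landis-Koch bands label "Substantial".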