Inter-Rater Reliability: Definition, Examples & Assessing - Statistics By Jim
AgreeStat/360: computing agreement coefficients (Fleiss' kappa, Gwet's AC1/AC2, Krippendorff's alpha, and more) with ratings in the form of a distribution of raters by subject and category
Inter-rater reliability using Fleiss' Kappa - YouTube
How to Calculate Fleiss' Kappa in Excel? - GeeksforGeeks
Cohen's Kappa and Fleiss' Kappa: How to Measure the Agreement Between Raters | by Audhi Aprilliant | Medium
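
The resources above all concern computing Fleiss' kappa. As a companion, here is a minimal sketch in plain Python (no external libraries), assuming the ratings arrive as an N-subjects × k-categories matrix of counts — the "distribution of raters by subject and category" format mentioned for AgreeStat/360. The function name and the example table are illustrative, not taken from any of the listed resources.

```python
def fleiss_kappa(counts):
    """Fleiss' kappa for an N-subjects x k-categories count matrix.

    counts[i][j] is the number of raters who assigned subject i to
    category j; every row must sum to the same number of raters n.
    """
    N = len(counts)          # number of subjects
    n = sum(counts[0])       # raters per subject (assumed constant)
    k = len(counts[0])       # number of categories

    # Observed agreement: mean of the per-subject agreements
    # P_i = (sum_j counts[i][j]^2 - n) / (n * (n - 1)).
    P_bar = sum(
        (sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts
    ) / N

    # Chance agreement P_e = sum_j p_j^2, where p_j is the overall
    # proportion of ratings falling in category j.
    total = N * n
    P_e = sum(
        (sum(row[j] for row in counts) / total) ** 2 for j in range(k)
    )

    return (P_bar - P_e) / (1 - P_e)


# Illustrative example: 10 subjects, 14 raters, 5 categories.
ratings = [
    [0, 0, 0, 0, 14], [0, 2, 6, 4, 2], [0, 0, 3, 5, 6],
    [0, 3, 9, 2, 0],  [2, 2, 8, 1, 1], [7, 7, 0, 0, 0],
    [3, 2, 6, 3, 0],  [2, 5, 3, 2, 2], [6, 5, 2, 1, 0],
    [0, 2, 2, 3, 7],
]
print(round(fleiss_kappa(ratings), 3))  # ≈ 0.21
```

Unlike Cohen's kappa, which compares exactly two raters, this formulation only needs the count of raters per category for each subject, so the individual raters need not be identified or even the same across subjects.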