The kappa statistic: a second look

  • Authors:
  • Barbara Di Eugenio; Michael Glass

  • Affiliations:
  • University of Illinois at Chicago, Computer Science, 1120 SEO (M/C 152), 851 South Morgan Street, Chicago, IL; Valparaiso University, Mathematics and Computer Science, 116 Gellerson Hall, Valparaiso, IN

  • Venue:
  • Computational Linguistics

  • Year:
  • 2004

Abstract

In recent years, the kappa coefficient of agreement has become the de facto standard for evaluating intercoder agreement for tagging tasks. In this squib, we highlight issues that affect κ and that the community has largely neglected. First, we discuss the assumptions underlying different computations of the expected agreement component of κ. Second, we discuss how prevalence and bias affect the κ measure.
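
To make the distinction concrete, here is a minimal Python sketch (not from the squib itself; the confusion matrices are invented for illustration). It contrasts the two standard computations of the expected agreement term for two coders: Cohen's (1960) kappa, which derives chance agreement from each coder's individual marginal distribution, and Siegel and Castellan's (1988) K, which derives it from a single distribution pooled over both coders. It then shows how prevalence can depress kappa even when observed agreement is held fixed.

import numpy as np

def cohen_kappa(confusion):
    # Cohen's (1960) kappa: chance agreement P(E) is computed from each
    # coder's own marginal distribution, so coder bias enters P(E).
    c = np.asarray(confusion, dtype=float)
    total = c.sum()
    p_observed = np.trace(c) / total
    p1 = c.sum(axis=1) / total   # coder 1's label distribution (rows)
    p2 = c.sum(axis=0) / total   # coder 2's label distribution (columns)
    p_expected = float(np.dot(p1, p2))
    return (p_observed - p_expected) / (1.0 - p_expected)

def siegel_castellan_K(confusion):
    # Siegel & Castellan's (1988) K: chance agreement is computed from
    # a single label distribution pooled over both coders.
    c = np.asarray(confusion, dtype=float)
    total = c.sum()
    p_observed = np.trace(c) / total
    p_pooled = (c.sum(axis=1) + c.sum(axis=0)) / (2.0 * total)
    p_expected = float(np.dot(p_pooled, p_pooled))
    return (p_observed - p_expected) / (1.0 - p_expected)

# Bias: same observed agreement (0.60), but the coders' marginal
# distributions differ, so the two computations of P(E) diverge.
biased = [[45, 15],
          [25, 15]]
print(cohen_kappa(biased))         # ~0.130
print(siegel_castellan_K(biased))  # ~0.121

# Prevalence: both tables have observed agreement 0.90, yet kappa
# drops sharply when one category dominates.
balanced = [[45, 5],
            [5, 45]]
skewed = [[85, 5],
          [5, 5]]
print(cohen_kappa(balanced))  # 0.80
print(cohen_kappa(skewed))    # ~0.44

When the coders' marginals differ, the two statistics diverge (the bias effect), and the two prevalence tables share an observed agreement of 0.90 yet yield kappa values of 0.80 and roughly 0.44; these are the two effects the squib examines.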