Inter-Coder Agreement for Computational Linguistics

  • Authors: Ron Artstein, Massimo Poesio


  • Venue: Computational Linguistics
  • Year: 2008

Abstract

This article is a survey of methods for measuring agreement among corpus annotators. It exposes the mathematics and underlying assumptions of agreement coefficients, covering Krippendorff's alpha as well as Scott's pi and Cohen's kappa; discusses the use of coefficients in several annotation tasks; and argues that weighted, alpha-like coefficients, traditionally less used than kappa-like measures in computational linguistics, may be more appropriate for many corpus annotation tasks, but that their use makes the interpretation of the value of the coefficient even harder.
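
The coefficients named above share the chance-corrected form (Ao - Ae) / (1 - Ae), where Ao is observed agreement and Ae is the agreement expected by chance; they differ in how Ae is estimated. As an illustration only (a minimal sketch, not code from the article), the following Python computes observed agreement, Scott's pi, and Cohen's kappa for two coders assigning nominal labels to the same items:

```python
from collections import Counter

def agreement_coefficients(coder1, coder2):
    """Observed agreement, Scott's pi, and Cohen's kappa for two coders
    labeling the same items with nominal categories."""
    assert len(coder1) == len(coder2)
    n = len(coder1)

    # Observed agreement Ao: fraction of items with identical labels.
    a_o = sum(c1 == c2 for c1, c2 in zip(coder1, coder2)) / n

    # Scott's pi: chance agreement Ae from a single pooled distribution
    # over both coders' labels.
    pooled = Counter(coder1) + Counter(coder2)
    a_e_pi = sum((count / (2 * n)) ** 2 for count in pooled.values())

    # Cohen's kappa: chance agreement Ae from each coder's own marginals.
    p1, p2 = Counter(coder1), Counter(coder2)
    a_e_kappa = sum((p1[k] / n) * (p2[k] / n) for k in set(p1) | set(p2))

    pi = (a_o - a_e_pi) / (1 - a_e_pi)
    kappa = (a_o - a_e_kappa) / (1 - a_e_kappa)
    return a_o, pi, kappa

# Hypothetical example: two coders labeling ten utterances.
c1 = ["stmt", "stmt", "q", "stmt", "q", "q",    "stmt", "stmt", "q", "stmt"]
c2 = ["stmt", "q",    "q", "stmt", "q", "stmt", "stmt", "stmt", "q", "stmt"]
print(agreement_coefficients(c1, c2))  # (0.8, 0.5833..., 0.5833...)
```

Scott's pi estimates chance agreement from one pooled label distribution, whereas Cohen's kappa uses each coder's individual marginals; Krippendorff's alpha, the article's recommended family of measures, generalizes the pooled approach to multiple coders, missing data, and weighted (non-nominal) category distinctions.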