Complex linguistic annotation --- no easy way out!: a case from Bangla and Hindi POS labeling tasks

  • Authors:
  • Sandipan Dandapat; Priyanka Biswas; Monojit Choudhury; Kalika Bali

  • Affiliations:
  • Dublin City University, Ireland; LDCIL, CIIL-Mysore, India; Microsoft Research Labs India, Bangalore, India; Microsoft Research Labs India, Bangalore, India

  • Venue:
  • ACL-IJCNLP '09: Proceedings of the Third Linguistic Annotation Workshop
  • Year:
  • 2009

Abstract

Alternative paths to linguistic annotation, such as those that use games or crowdsource the work to web users, have become popular in recent times owing to their very high benefit-to-cost ratios. In this paper, however, we report a case study on POS annotation for Bangla and Hindi, where we observe that reliable linguistic annotation requires not only expert annotators but also a great deal of supervision. For our hierarchical POS annotation scheme, we find that close supervision and training are necessary at every level of the hierarchy, or equivalently, at every level of tagset complexity. Nevertheless, an intelligent annotation tool can significantly accelerate the annotation process and increase inter-annotator agreement for both expert and non-expert annotators. These findings lead us to believe that annotation tasks demanding deep linguistic knowledge (e.g., POS tagging, chunking, treebanking, semantic role labeling) cannot be done reliably without expertise and supervision. The focus, therefore, should be on the design and development of annotation tools equipped with machine-learning-based predictive modules that can significantly boost annotator productivity.
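The abstract's central measurement, inter-annotator agreement, is typically reported with Cohen's kappa, which corrects raw agreement for chance. As a minimal illustrative sketch (not code from the paper; the tag values are hypothetical), kappa for two annotators' POS labels over the same tokens can be computed as:

```python
from collections import Counter

def cohens_kappa(tags_a, tags_b):
    """Cohen's kappa for two annotators' label sequences of equal length."""
    assert len(tags_a) == len(tags_b) and tags_a
    n = len(tags_a)
    # Observed agreement: fraction of tokens given identical labels.
    p_o = sum(a == b for a, b in zip(tags_a, tags_b)) / n
    # Expected chance agreement from each annotator's label distribution.
    freq_a, freq_b = Counter(tags_a), Counter(tags_b)
    p_e = sum(freq_a[t] * freq_b.get(t, 0) for t in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical POS labels from two annotators over the same six tokens.
ann1 = ["NN", "VB", "NN", "JJ", "NN", "VB"]
ann2 = ["NN", "VB", "JJ", "JJ", "NN", "NN"]
print(round(cohens_kappa(ann1, ann2), 3))  # 0.478
```

A hierarchical tagset like the one the paper describes would let kappa be computed separately at each level of the hierarchy, with coarser levels generally yielding higher agreement.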