Assessment of different workflow strategies for annotating discourse relations: a case study with HDRB

  • Authors:
  • Himanshu Sharma (LTRC, IIIT-Hyderabad, India); Praveen Dakwale (LTRC, IIIT-Hyderabad, India); Dipti M. Sharma (LTRC, IIIT-Hyderabad, India); Rashmi Prasad (University of Wisconsin-Milwaukee); Aravind Joshi (University of Pennsylvania)

  • Venue:
  • CICLing'13: Proceedings of the 14th International Conference on Computational Linguistics and Intelligent Text Processing, Part I
  • Year:
  • 2013

Abstract

In this paper, we present our experiments with different annotation workflows for annotating discourse relations in the Hindi Discourse Relation Bank (HDRB). Given the growing interest in developing discourse data banks based on the PDTB framework, and the complexities associated with discourse annotation, it is important to study and analyze the approaches and practices followed in the annotation process. The ultimate goal is to find an optimal balance between accurate description of discourse relations and maximal inter-rater reliability. We address the question of how the choice of annotation workflow for discourse affects the consistency, and hence the quality, of the annotation. We conduct multiple annotation experiments using different workflow strategies and evaluate their impact on inter-annotator agreement. Our results show that the choice of annotation workflow has a significant effect on the annotation load and on annotators' comprehension of discourse relations, as reflected in the inter-annotator agreement results.
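
The abstract does not specify which agreement statistic the authors compute; studies in the PDTB tradition commonly report Cohen's kappa, which corrects raw agreement for chance. The sketch below is a minimal, self-contained illustration of that standard measure, assuming exactly two annotators labeling the same items; the relation labels and annotator data are hypothetical, not taken from HDRB.

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa between two annotators' labels over the same items."""
    assert len(labels_a) == len(labels_b), "annotators must label the same items"
    n = len(labels_a)
    # Observed agreement: fraction of items with identical labels.
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement: expected overlap if each annotator labeled
    # independently according to their own label distribution.
    counts_a, counts_b = Counter(labels_a), Counter(labels_b)
    expected = sum(
        (counts_a[label] / n) * (counts_b[label] / n)
        for label in counts_a.keys() & counts_b.keys()
    )
    return (observed - expected) / (1 - expected)

# Hypothetical discourse-relation sense labels from two annotators.
ann1 = ["Contrast", "Cause", "Conjunction", "Cause", "Contrast"]
ann2 = ["Contrast", "Cause", "Cause", "Cause", "Contrast"]
print(f"kappa = {cohens_kappa(ann1, ann2):.3f}")  # kappa = 0.667
```

Comparing kappa across workflow strategies, rather than raw agreement, keeps the comparison fair when different workflows shift the distribution of relation labels that annotators assign.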