Web Wisdom: How to Evaluate and Create Web Page Quality
In this paper we present a detailed scheme for annotating medical web pages intended for health care consumers. The annotation is along two axes: first, by reliability, i.e., the extent to which the medical information on the page can be trusted; and second, by page type (patient leaflet, commercial, link, medical article, testimonial, or support). We analyze inter-rater agreement among three judges for each axis. Agreement was moderate (0.77 accuracy, 0.62 F-measure, 0.49 Kappa) on the reliability axis and good (0.81 accuracy, 0.72 F-measure, 0.73 Kappa) on the type axis.
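The Kappa statistic reported above corrects raw agreement for chance. As a minimal illustration (not the paper's actual data or implementation), the pairwise Cohen's Kappa between two judges can be computed from observed agreement and the agreement expected from each judge's label distribution; the labels below are hypothetical.

```python
from collections import Counter

def cohens_kappa(a, b):
    """Cohen's kappa for two annotators' labels on the same items:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(a) == len(b) and len(a) > 0
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    # Chance agreement: probability both judges pick the same label
    # independently, given their marginal label frequencies.
    expected = sum(ca[l] * cb[l] for l in set(a) | set(b)) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical reliability judgments for eight pages from two judges
j1 = ["reliable", "reliable", "unreliable", "reliable",
      "unreliable", "reliable", "unreliable", "reliable"]
j2 = ["reliable", "unreliable", "unreliable", "reliable",
      "unreliable", "reliable", "reliable", "reliable"]
print(round(cohens_kappa(j1, j2), 2))  # 0.47: raw agreement is 0.75
```

Note that Cohen's Kappa is defined for a pair of annotators; with three judges, as in the study, agreement is typically reported as an average over pairs or with a multi-rater generalization such as Fleiss' Kappa.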