Automatic question generation for literature review writing support

  • Authors:
  • Ming Liu; Rafael A. Calvo; Vasile Rus

  • Affiliations:
  • University of Sydney, Sydney, NSW, Australia; University of Sydney, Sydney, NSW, Australia; University of Memphis, Memphis, TN

  • Venue:
  • ITS '10: Proceedings of the 10th International Conference on Intelligent Tutoring Systems - Volume Part I
  • Year:
  • 2010

Abstract

This paper presents a novel Automatic Question Generation (AQG) approach that generates trigger questions as a form of support for students' learning through writing. The approach first automatically extracts citations from students' compositions together with key content elements. Next, the citations are classified using a rule-based approach, and questions are generated from a set of templates and the content elements. A pilot study using the Bystander Turing Test investigated differences in writers' perception of questions generated by our AQG system and by humans (Human Tutor, Lecturer, or Generic Question). The human evaluators had moderate difficulty distinguishing questions generated by the proposed system from those produced by humans (F-score = 0.43). Moreover, further results show that our system significantly outscores the Generic Questions on overall quality measures.
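The pipeline described in the abstract (extract citations, classify them with rules, fill question templates) can be sketched as follows. This is a minimal illustration under assumed rule patterns, categories, and templates; the paper's actual rule set and template inventory are not specified here, so every keyword and template below is a hypothetical placeholder.

```python
# Hypothetical sketch of rule-based citation classification followed by
# template-based trigger-question generation. The categories, keyword
# rules, and templates are illustrative assumptions, not the paper's
# actual resources.

import re

# Illustrative citation categories with keyword rules (assumed).
RULES = [
    ("opinion", re.compile(r"\b(argue[sd]?|claim(s|ed)?|believe[sd]?)\b", re.I)),
    ("result",  re.compile(r"\b(found|show(s|ed)?|demonstrat\w+)\b", re.I)),
    ("method",  re.compile(r"\b(propose[sd]?|develop(s|ed)?|present(s|ed)?)\b", re.I)),
]

# Illustrative trigger-question templates keyed by category (assumed).
TEMPLATES = {
    "opinion": "Why does {author} hold this view, and do you agree?",
    "result":  "How do the findings of {author} relate to your own argument?",
    "method":  "What are the limitations of the approach proposed by {author}?",
    "other":   "Why is the work of {author} relevant to your review?",
}

def classify_citation(sentence: str) -> str:
    """Return the first rule category whose pattern matches, else 'other'."""
    for category, pattern in RULES:
        if pattern.search(sentence):
            return category
    return "other"

def generate_question(author: str, sentence: str) -> str:
    """Fill the question template for the citation's category."""
    category = classify_citation(sentence)
    return TEMPLATES[category].format(author=author)
```

For example, a citation sentence such as "Smith (2008) proposed a new parser" would be classified as a method citation and yield a question about the limitations of Smith's approach.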