Using information extraction to generate trigger questions for academic writing support

  • Authors:
  • Ming Liu; Rafael A. Calvo

  • Affiliations:
  • University of Sydney, Sydney, NSW, Australia; University of Sydney, Sydney, NSW, Australia

  • Venue:
  • ITS '12: Proceedings of the 11th International Conference on Intelligent Tutoring Systems
  • Year:
  • 2012


Abstract

Automated question generation approaches have been proposed to support reading comprehension. However, these approaches are not suitable for supporting writing activities. We present a novel approach to generating different forms of trigger questions (directive and facilitative) aimed at supporting deep learning. Useful semantic information from Wikipedia articles is extracted and linked to the key phrases in a student's literature review, focusing in particular on extracting information containing three types of relations (Kind-of, Similar-to and Different-to) by using syntactic pattern matching rules. We collected literature reviews from 23 Engineering research students and evaluated the quality of 306 computer-generated questions and 115 generic questions. Facilitative questions proved more useful for deep learning about the topic, while directive questions were clearer and more useful for improving the composition.
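The core idea of the approach (extracting Kind-of, Similar-to and Different-to relations via pattern matching and instantiating question templates from them) can be sketched as follows. This is a minimal illustrative sketch only: the pattern rules and question templates below are hypothetical stand-ins, and the paper's actual rules operate on syntactic parses of Wikipedia sentences rather than surface regexes.

```python
import re

# Hypothetical surface patterns for the three relation types, each paired
# with a trigger-question template. Real rules would match parse trees.
PATTERNS = [
    ("Kind-of",
     re.compile(r"(?P<a>[\w -]+?) is a (?:kind|type) of (?P<b>[\w -]+)", re.I),
     "Why is {a} considered a kind of {b}?"),
    ("Similar-to",
     re.compile(r"(?P<a>[\w -]+?) is similar to (?P<b>[\w -]+)", re.I),
     "In what ways is {a} similar to {b}?"),
    ("Different-to",
     re.compile(r"(?P<a>[\w -]+?) differs from (?P<b>[\w -]+)", re.I),
     "How does {a} differ from {b}?"),
]

def generate_questions(sentence):
    """Match relation patterns in a sentence and instantiate questions.

    Returns a list of (relation_label, question) pairs.
    """
    questions = []
    for label, pattern, template in PATTERNS:
        match = pattern.search(sentence)
        if match:
            a = match.group("a").strip()
            b = match.group("b").strip()
            questions.append((label, template.format(a=a, b=b)))
    return questions
```

In the full system, the matched sentences would come from Wikipedia articles linked to key phrases found in the student's literature review, so the generated questions stay anchored to the student's own topic.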