Experimental Evaluation of Automatic Hint Generation for a Logic Tutor

  • Authors:
  • John Stamper, Michael Eagle, Tiffany Barnes, Marvin Croy

  • Affiliations:
  • John Stamper: Human-Computer Interaction Institute, Carnegie Mellon University, Pittsburgh, PA, USA. john@stamper.org
  • Michael Eagle: Department of Computer Science, University of North Carolina at Charlotte, Charlotte, NC, USA. maikuusa@gmail.com
  • Tiffany Barnes: Department of Computer Science, North Carolina State University, Raleigh, NC, USA. tiffany.barnes@gmail.com
  • Marvin Croy: Department of Philosophy, University of North Carolina at Charlotte, Charlotte, NC, USA. mjcroy@uncc.edu

  • Venue:
  • International Journal of Artificial Intelligence in Education - Best of AIED 2011
  • Year:
  • 2013

Abstract

We have augmented the Deep Thought logic tutor with a Hint Factory that generates data-driven, context-specific hints for an existing computer-aided instructional tool. We investigate the impact of the Hint Factory's automatically generated hints on educational outcomes in a switching replications experiment, which shows that hints help students persist in a deductive logic proofs tutor. Three instructors each taught two semester-long courses: one semester using the logic tutor with hints and one semester using the tutor without hints, controlling for the impact of different instructors on course outcomes. Our results show that students in the courses using the logic tutor augmented with automatically generated hints attempted and completed significantly more logic proof problems, were less likely to abandon the tutor, performed significantly better on a post-test implemented within the tutor, and achieved higher grades in the course.