Experimental evaluation of automatic hint generation for a logic tutor

  • Authors:
  • John C. Stamper; Michael Eagle; Tiffany Barnes; Marvin Croy

  • Affiliations:
  • Human-Computer Interaction Institute, Carnegie Mellon University; Department of Computer Science, University of North Carolina at Charlotte; Department of Computer Science, University of North Carolina at Charlotte; Department of Philosophy, University of North Carolina at Charlotte

  • Venue:
  • AIED'11: Proceedings of the 15th International Conference on Artificial Intelligence in Education
  • Year:
  • 2011

Abstract

In our prior work, we showed it was feasible to augment a logic tutor with a data-driven Hint Factory that uses data to automatically generate context-specific hints for an existing computer-aided instructional tool. Here we investigate the impact of automatically generated hints on educational outcomes in a robust experiment showing that hints help students persist in deductive logic courses. Three instructors each taught two semester-long courses: one semester using a logic tutor with hints, and one semester using the tutor without hints, controlling for the impact of different instructors on course outcomes. Our results show that students in the courses using the logic tutor augmented with automatically generated hints attempted and completed significantly more logic proof problems, were less likely to abandon the tutor, and performed significantly better on a post-test administered within the tutor.