Towards Knowledge-Based Affective Interaction: Situational Interpretation of Affect
ACII '07 Proceedings of the 2nd international conference on Affective Computing and Intelligent Interaction
Many software systems would perform significantly better if they could interpret the nonverbal cues in their users' interactions, as humans normally do. Currently, Intelligent Tutoring Systems (ITSs), like other software systems, cannot use nonverbal cues to interpret students' responses to instructional material the way human tutors can. We believe this capability is essential for adapting a teaching strategy to the needs of the learner. We performed an experiment aimed at identifying which kinds of gestures students use in a human-to-human learning context. We identified a range of gestures used in one-to-one tutoring environments, as well as a dependency of gesture use on students' skill level. Based on these findings, we suggest how the student model in an ITS should reflect this dependency. The results are also applicable to HCI in general.