Affect detection from multichannel physiology during learning sessions with AutoTutor

  • Authors:
  • M. S. Hussain; Omar AlZoubi; Rafael A. Calvo; Sidney K. D'Mello

  • Affiliations:
  • National ICT Australia, Australian Technology Park, Eveleigh, Australia and School of Electrical and Information Engineering, University of Sydney, Australia; School of Electrical and Information Engineering, University of Sydney, Australia; School of Electrical and Information Engineering, University of Sydney, Australia; Institute for Intelligent Systems, University of Memphis, Memphis

  • Venue:
  • AIED'11: Proceedings of the 15th International Conference on Artificial Intelligence in Education
  • Year:
  • 2011

Abstract

It is widely acknowledged that learners experience a variety of emotions while interacting with Intelligent Tutoring Systems (ITS); hence, detecting and responding to these emotions might improve learning outcomes. This study uses machine learning techniques to detect learners' affective states from multichannel physiological signals (heart activity, respiration, facial muscle activity, and skin conductivity) during tutorial interactions with AutoTutor, an ITS with conversational dialogues. Learners self-reported the affective states they experienced during their sessions with AutoTutor, both as discrete emotions and as degrees of valence/arousal, via a retrospective judgment protocol administered immediately after the tutorial sessions. In addition to mapping the discrete learning-centered emotions (e.g., confusion, frustration) onto a dimensional valence/arousal space, we developed and validated an automatic affect classifier based on the physiological signals. Results indicate that the classifier was moderately successful at detecting naturally occurring emotions during the AutoTutor sessions.
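
The abstract does not specify the feature set or classification algorithm used. As a purely illustrative aid, the sketch below shows one common way such a physiology-based affect classifier is assembled: window-level summary features are extracted from each of the four channels and fed to a standard classifier evaluated with cross-validation. All feature choices, labels, and data here are hypothetical placeholders, not the authors' actual method.

```python
# Hypothetical sketch of a multichannel physiological affect classifier,
# assuming scikit-learn. Feature set, labels, and data are illustrative only.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def window_features(ecg, resp, emg, scl):
    """Summary statistics per channel for one signal window (illustrative)."""
    feats = []
    for channel in (ecg, resp, emg, scl):
        feats += [np.mean(channel), np.std(channel),
                  np.min(channel), np.max(channel)]
    return np.array(feats)

rng = np.random.default_rng(0)
# Placeholder data: 200 windows x 4 channels x 512 samples, with discrete
# affect labels (e.g., 0=boredom, 1=confusion, 2=frustration, 3=neutral).
windows = rng.standard_normal((200, 4, 512))
labels = rng.integers(0, 4, size=200)

# Build the feature matrix: one row of channel statistics per window.
X = np.array([window_features(*w) for w in windows])

# Standardize features, then classify with an RBF-kernel SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, labels, cv=10)
print(f"10-fold accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```

With real recordings, the placeholder statistics would typically be replaced by domain-specific features (e.g., heart-rate variability from ECG, breathing rate from respiration), but the overall feature-extraction-then-classification structure is the standard pattern the abstract describes.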