Multimodal affect detection from physiological and facial features during ITS interaction

  • Authors:
  • M. S. Hussain; Rafael A. Calvo

  • Affiliations:
  • National ICT Australia, Australian Technology Park, Eveleigh, Australia and School of Electrical and Information Engineering, University of Sydney, Australia; School of Electrical and Information Engineering, University of Sydney, Australia

  • Venue:
  • AIED'11: Proceedings of the 15th International Conference on Artificial Intelligence in Education
  • Year:
  • 2011


Abstract

Multimodal approaches are increasingly used for affect detection. This paper proposes a model for fusing physiological signals that measure learners' heart activity with their facial expressions to detect learners' affective states during interaction with an Intelligent Tutoring System (ITS). It studies machine learning and fusion techniques that classify the system's automated feedback from the individual channels and from their feature-level fusion. It also evaluates the classification performance of fusion models in multimodal systems, identifying the effects of fusion relative to the individual modalities.
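As an illustration of the feature-level fusion the abstract refers to, the sketch below shows the general idea: feature vectors from each modality (heart activity and facial expressions) are concatenated into a single vector before one classifier is trained, in contrast to decision-level fusion, where per-modality classifiers are combined afterwards. This is a minimal sketch of the generic technique, not the authors' actual pipeline; the feature names and example values are hypothetical.

```python
def fuse_features(heart_features, face_features):
    """Feature-level fusion: concatenate per-modality feature vectors
    into one vector, to be fed to a single classifier."""
    return list(heart_features) + list(face_features)

# Hypothetical example values for one observation window:
heart = [72.0, 0.05]       # e.g. mean heart rate, a heart-rate-variability measure
face = [0.8, 0.1, 0.3]     # e.g. intensities of facial expression features

fused = fuse_features(heart, face)
print(fused)  # one 5-dimensional vector combining both modalities
```

A classifier trained on such fused vectors can then be compared against classifiers trained on each modality alone, which is the kind of evaluation the abstract describes.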