Investigating acoustic cues in automatic detection of learners' emotion from auto tutor

  • Authors:
  • Rui Sun; Elliot Moore

  • Affiliations:
  • Georgia Institute of Technology, School of Electrical and Computer Engineering, Technology Circle, Savannah, GA (both authors)

  • Venue:
  • ACII'11 Proceedings of the 4th International Conference on Affective Computing and Intelligent Interaction - Volume Part II
  • Year:
  • 2011

Abstract

This study investigates the emotion-discriminant ability of acoustic cues in speech collected from the automatic computer tutoring system AutoTutor. The purpose of this study is to examine acoustic cues for emotion detection in the speech channel of the learning system, and to compare the emotion-discriminant performance of acoustic cues (in this study) with that of conversational cues (available in previous work). A comparison of the classification performance obtained using acoustic cues versus conversational cues shows that the emotions flow and boredom are better captured by acoustic cues than by conversational cues, while conversational cues play a more important role in multiple-emotion classification.