Emotion recognition in human-computer interaction. Neural Networks, 2005 special issue: Emotion and Brain.
Data Mining: Practical Machine Learning Tools and Techniques, Second Edition. Morgan Kaufmann Series in Data Management Systems.
Automatic detection of learner's affect from conversational cues. User Modeling and User-Adapted Interaction.
Automatic detection of learner's affect from gross body language. Applied Artificial Intelligence.
Emotions and learning with AutoTutor. Proceedings of the 2007 Conference on Artificial Intelligence in Education: Building Technology Rich Learning Contexts That Work.
Investigating glottal parameters for differentiating emotional categories with similar prosodics. ICASSP '09: Proceedings of the 2009 IEEE International Conference on Acoustics, Speech and Signal Processing.
The WEKA data mining software: an update. ACM SIGKDD Explorations Newsletter.
Estimation of glottal closure instants in voiced speech using the DYPSA algorithm. IEEE Transactions on Audio, Speech, and Language Processing.
Analysis of emotionally salient aspects of fundamental frequency for emotion detection. IEEE Transactions on Audio, Speech, and Language Processing.
This study investigates the emotion-discriminant ability of acoustic cues extracted from speech collected in the automatic computer tutoring system AutoTutor. Its purpose is twofold: to examine acoustic cues for emotion detection in the speech channel of the learning system, and to compare the emotion-discriminant performance of these acoustic cues with that of the conversational cues examined in previous work. The comparison of classification performance shows that flow and boredom are captured better by acoustic cues than by conversational cues, whereas conversational cues play the more important role in multi-way emotion classification.
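The comparison described above, training a classifier on one feature set at a time and contrasting per-emotion accuracy, can be sketched as follows. This is a minimal illustration only: the data are synthetic Gaussian clusters standing in for real acoustic and conversational feature vectors (the study itself used WEKA classifiers on features from AutoTutor sessions), the cluster `separation` values are arbitrary assumptions, and the nearest-centroid classifier is a deliberately simple stand-in.

```python
import numpy as np

rng = np.random.default_rng(0)
EMOTIONS = ["flow", "boredom", "confusion", "frustration"]

def make_features(n_per_class, n_dims, separation):
    # Synthetic stand-in for a real feature set: one Gaussian cluster per
    # emotion; `separation` controls how discriminable the classes are.
    X, y = [], []
    for k, _ in enumerate(EMOTIONS):
        centre = rng.normal(0.0, separation, size=n_dims)
        X.append(centre + rng.normal(0.0, 1.0, size=(n_per_class, n_dims)))
        y.extend([k] * n_per_class)
    return np.vstack(X), np.array(y)

def nearest_centroid_accuracy(X, y):
    # Split in half, fit class centroids on the training half, and
    # classify the test half by nearest centroid.
    idx = rng.permutation(len(y))
    tr, te = idx[: len(y) // 2], idx[len(y) // 2 :]
    centroids = np.stack(
        [X[tr][y[tr] == k].mean(axis=0) for k in range(len(EMOTIONS))]
    )
    dists = np.linalg.norm(X[te][:, None, :] - centroids[None, :, :], axis=2)
    return float((dists.argmin(axis=1) == y[te]).mean())

# Assumed (illustrative) separations: the acoustic feature space is made
# more separable here purely to mimic the comparison, not the real data.
X_ac, y_ac = make_features(40, 8, separation=2.0)
X_cv, y_cv = make_features(40, 8, separation=1.0)
print("acoustic accuracy:      ", nearest_centroid_accuracy(X_ac, y_ac))
print("conversational accuracy:", nearest_centroid_accuracy(X_cv, y_cv))
```

In the actual study the same train/test protocol would be run once per feature set on identical utterances, so any accuracy gap is attributable to the features rather than the classifier.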