Recognition of Affective Communicative Intent in Robot-Directed Speech. Autonomous Robots.
Baby Ears: A Recognition System for Affective Vocalizations. Speech Communication.
Motivation Diagnosis in Intelligent Tutoring Systems. ITS '98: Proceedings of the 4th International Conference on Intelligent Tutoring Systems.
Emotion Detection from Speech to Enrich Multimedia Content. PCM '01: Proceedings of the Second IEEE Pacific Rim Conference on Multimedia: Advances in Multimedia Information Processing.
Analysis of Emotion Recognition Using Facial Expressions, Speech and Multimodal Information. Proceedings of the 6th International Conference on Multimodal Interfaces.
Emotions in Speech: Juristic Implications. Speaker Classification I.
Extracting Emotion from Speech: Towards Emotional Speech-Driven Facial Animations. SG '03: Proceedings of the 3rd International Conference on Smart Graphics.
Content-Based Affective Image Classification and Retrieval Using Support Vector Machines. ACII '05: Proceedings of the First International Conference on Affective Computing and Intelligent Interaction.
This paper reports results from preliminary experiments on the automatic classification of spoken affect valence. The task was to classify short spoken sentences into one of two classes: approving or disapproving. Using an optimal combination of six acoustic measurements, our classifier achieved accuracies of 65% to 88% for speaker-dependent, text-independent classification. The results suggest that pitch and energy measurements can be used to automatically classify spoken affect valence, but more research is needed to understand individual variation and to broaden the range of affect classes that can be recognized. In a second experiment, we measured human performance in classifying the same speech samples and found similarities between the human and automatic classification results.
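The setup described in the abstract — a two-class (approving vs. disapproving) decision over a handful of pitch and energy statistics — can be sketched with a simple nearest-centroid classifier. This is an illustrative sketch only: the six feature names, the synthetic feature values, and the nearest-centroid rule are assumptions for demonstration, not the paper's actual measurements, data, or classifier.

```python
# Hypothetical sketch: two-class affect-valence classification from six
# prosodic features (pitch/energy statistics), using a nearest-centroid
# rule. All feature names and values below are invented for illustration.
import math
import random

# Assumed feature layout: six pitch/energy statistics per utterance.
FEATURES = ["pitch_mean", "pitch_var", "pitch_range",
            "energy_mean", "energy_var", "energy_range"]

def centroid(samples):
    """Per-dimension mean of a list of feature vectors."""
    n = len(samples)
    return [sum(s[i] for s in samples) / n for i in range(len(FEATURES))]

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class NearestCentroidAffect:
    """Speaker-dependent, text-independent two-class classifier."""

    def fit(self, approving, disapproving):
        self.c_app = centroid(approving)
        self.c_dis = centroid(disapproving)
        return self

    def predict(self, x):
        # Assign the utterance to the nearer class centroid.
        return ("approving"
                if distance(x, self.c_app) < distance(x, self.c_dis)
                else "disapproving")

# Synthetic training data: approving speech is given higher, more variable
# pitch here purely as an illustrative assumption, not the paper's finding.
random.seed(0)
approving = [[220 + random.gauss(0, 10), 40 + random.gauss(0, 5), 80,
              0.7, 0.1, 0.5] for _ in range(20)]
disapproving = [[140 + random.gauss(0, 10), 15 + random.gauss(0, 5), 30,
                 0.5, 0.05, 0.3] for _ in range(20)]

clf = NearestCentroidAffect().fit(approving, disapproving)
print(clf.predict([225, 42, 78, 0.7, 0.1, 0.5]))   # → approving
print(clf.predict([138, 14, 32, 0.5, 0.05, 0.3]))  # → disapproving
```

In practice the paper's "optimal combination" of measurements suggests feature selection on top of such a classifier; the sketch omits that step and uses all six assumed features directly.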