The `traditional' first two dimensions in emotion research are VALENCE and AROUSAL. Normally, they are obtained using elicited, acted data. In this paper, we use realistic, spontaneous speech data from our `AIBO' corpus (human-robot communication: children interacting with Sony's AIBO robot). The recordings were made in a Wizard-of-Oz scenario: the children believed that AIBO obeyed their commands; in fact, AIBO followed a fixed script and often disobeyed. Five labellers annotated each word as belonging to one of eleven emotion-related states; the seven of these states that occurred frequently enough are dealt with in this paper. The confusion matrices of these labels were used in a Non-Metric Multi-Dimensional Scaling (NMDS) to display two dimensions; the first we interpret as VALENCE, the second, however, not as AROUSAL but as INTERACTION, i.e., addressing oneself (angry, joyful) or the communication partner (motherese, reprimanding). We show that it depends on the specificity of the scenario and on the subjects' conceptualisations whether this new dimension can be observed, and discuss the impact on the practice of labelling and processing emotional data. Two-dimensional solutions based on acoustic and linguistic features that were used for automatic classification of these emotional states are interpreted along the same lines.
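The core analysis step described above — turning inter-labeller confusion matrices into a two-dimensional embedding via non-metric MDS — can be sketched as follows. This is a minimal illustration, not the authors' pipeline: the confusion counts are synthetic, the label set and its ordering are assumed from the paper's description of the seven frequent states, and scikit-learn's `MDS` is used as a stand-in NMDS implementation.

```python
# Hypothetical sketch: embed emotion labels in 2-D from a labeller
# confusion matrix, in the spirit of the paper's NMDS analysis.
# The confusion counts below are synthetic, NOT the paper's data.
import numpy as np
from sklearn.manifold import MDS

# Assumed set of the seven frequent emotion-related states.
labels = ["angry", "emphatic", "joyful", "motherese",
          "neutral", "reprimanding", "touchy"]

# Synthetic confusion counts C[i, j]: one labeller chose i, another chose j.
rng = np.random.default_rng(0)
C = rng.integers(1, 20, size=(7, 7)) + 60 * np.eye(7, dtype=int)

# Turn confusions into dissimilarities: labels that are often confused
# with each other should end up close together in the embedding.
P = C / C.sum(axis=1, keepdims=True)   # row-normalised confusion probabilities
S = (P + P.T) / 2.0                    # symmetrise
D = 1.0 - S                            # high confusion -> small distance
np.fill_diagonal(D, 0.0)

# Non-metric MDS on the precomputed dissimilarities; the two resulting
# axes are then inspected and interpreted (e.g. VALENCE, INTERACTION).
mds = MDS(n_components=2, metric=False, dissimilarity="precomputed",
          random_state=0)
coords = mds.fit_transform(D)
for name, (x, y) in zip(labels, coords):
    print(f"{name:12s} {x:+.3f} {y:+.3f}")
```

Note that NMDS axes carry no intrinsic meaning; as in the paper, interpreting one axis as VALENCE and the other as INTERACTION is a post-hoc reading of where the labels fall.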