Affective states and their non-verbal expressions are an important aspect of human reasoning, communication and social life. Automated recognition of affective states can be integrated into a wide variety of applications across many fields. It is therefore of interest to design systems that can infer speakers' affective states from the non-verbal expressions in speech that occur in real scenarios. This paper presents such a system, together with the framework for its design and validation. The framework defines a representation method comprising a set of affective-state groups, or archetypes, that often appear in everyday life. The inference system is designed to infer combinations of affective states that can occur simultaneously and whose levels of expression can change over time. The framework also addresses the validation and generalisation of the system. The system was built from 36 independent pair-wise comparison machines, with an average accuracy (tenfold cross-validation) of 75%. The accumulated inference system yielded a total accuracy of 83% and recognised combinations corresponding to different nuances within the affective-state groups. Beyond recognising these groups, the inference system was applied to the characterisation of a very large variety of affective-state concepts (549 concepts) as combinations of the affective-state groups. It was also applied, with no additional training, to the annotation of affective states naturally evoked during sustained human-computer interactions, to multi-modal analysis of those interactions, to new speakers and to a different language. The system provides a powerful tool for the recognition, characterisation, annotation (interpretation) and analysis of affective states. In addition, the results inferred from speech in both English and Hebrew indicate that the vocal expressions of complex affective states such as thinking, certainty and interest transcend language boundaries.
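The pair-wise architecture described above can be sketched in a few lines. Since 36 independent pair-wise machines correspond to C(9, 2) = 36 unordered pairs, the sketch assumes nine affective-state groups; the group names below are placeholders, not the paper's actual labels, and the binary classifiers are stubbed out, since the trained machines are not specified here.

```python
from itertools import combinations

# Placeholder group names (assumption): the abstract implies nine groups,
# because C(9, 2) = 36 pair-wise comparison machines.
GROUPS = ["group1", "group2", "group3", "group4", "group5",
          "group6", "group7", "group8", "group9"]

def pairwise_machines(groups):
    """Enumerate every unordered pair of groups: one binary machine each."""
    return list(combinations(groups, 2))

def accumulate_votes(machines, decide):
    """Run every pair-wise machine on an utterance and tally votes per group.

    `decide(a, b)` stands in for a trained binary classifier that returns
    whichever of the two groups it judges more likely for the utterance.
    """
    votes = {g: 0 for pair in machines for g in pair}
    for a, b in machines:
        votes[decide(a, b)] += 1
    return votes

machines = pairwise_machines(GROUPS)
print(len(machines))  # 36 machines for 9 groups
```

Because the accumulated votes form a per-group score rather than a single winner, thresholding or ranking them (instead of taking only the argmax) is one plausible way such a system could report combinations of co-occurring affective states, as the abstract describes.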