Affective computing
Wearable and automotive systems for affect recognition from physiology
Analysis of emotion recognition using facial expressions, speech and multimodal information
Proceedings of the 6th international conference on Multimodal interfaces
Emotions and heart rate while sitting on a chair
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Recognizing emotions for the audio-visual document indexing
ISCC '04 Proceedings of the Ninth International Symposium on Computers and Communications 2004 - Volume 2
Bi-modal emotion recognition from expressive face and body gestures
Journal of Network and Computer Applications
Towards an Algebraic Modeling of Emotional States
ICIW '10 Proceedings of the 2010 Fifth International Conference on Internet and Web Applications and Services
EmotionML - an upcoming standard for representing emotions and related states
ACII'11 Proceedings of the 4th international conference on Affective computing and intelligent interaction - Volume Part I
Emotion recognition using physiological signals
ICAT'06 Proceedings of the 16th international conference on Advances in Artificial Reality and Tele-Existence
Emotions play a crucial role in human-computer interaction. They are generally expressed and perceived through multiple modalities, such as speech, facial expressions, and physiological signals. The complexity of emotions makes their acquisition difficult and renders unimodal systems (i.e., those observing only one source of emotion) unreliable and often unfeasible in highly complex applications. Moreover, the lack of a standard for modeling human emotions hinders the sharing of affective information between applications. In this paper, the authors present a multimodal approach to emotion recognition that draws on several sources of information. The paper aims to provide a multimodal system for emotion recognition and exchange that will facilitate inter-system exchanges and improve the credibility of emotional interaction between users and computers. The authors elaborate a multimodal emotion recognition method from physiological data based on signal processing algorithms. Their method makes it possible to recognize emotions with several components, such as simulated and masked emotions, and it uses a new multidimensional model that represents emotional states through an algebraic representation. The experimental results show that the proposed multimodal emotion recognition method improves recognition rates in comparison to the unimodal approach. Compared to state-of-the-art multimodal techniques, the proposed method achieves good results, with 72% correct recognition.
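The abstract's core idea, representing emotional states as points in a multidimensional space and fusing unimodal estimates into one multimodal decision, can be illustrated with a minimal sketch. This is a hypothetical decision-level fusion over a two-dimensional valence/arousal space, not the authors' actual algebraic model or signal-processing pipeline; the prototype coordinates, modality confidences, and emotion labels below are all illustrative assumptions.

```python
import math

# Hypothetical prototype emotions as (valence, arousal) points in [-1, 1]^2.
# These coordinates are illustrative, not taken from the paper.
PROTOTYPES = {
    "joy":     ( 0.8,  0.6),
    "anger":   (-0.7,  0.7),
    "sadness": (-0.6, -0.5),
    "calm":    ( 0.5, -0.6),
}

def fuse(estimates):
    """Decision-level fusion: each modality (e.g., heart rate, skin
    conductance) contributes a ((valence, arousal), confidence) pair;
    the fused state is the confidence-weighted mean of the points."""
    total = sum(c for _, c in estimates)
    v = sum(p[0] * c for p, c in estimates) / total
    a = sum(p[1] * c for p, c in estimates) / total
    return (v, a)

def classify(point):
    """Map a fused (valence, arousal) point to the nearest prototype label."""
    return min(PROTOTYPES, key=lambda e: math.dist(point, PROTOTYPES[e]))

# Example: one channel reports negative valence with high arousal, another
# agrees on negative valence; fusion resolves the state to "anger".
readings = [((-0.5, 0.8), 0.6), ((-0.8, 0.5), 0.4)]
print(classify(fuse(readings)))  # → anger
```

The sketch shows why fusion can beat a unimodal decision: a single noisy channel may sit near a prototype boundary, while the confidence-weighted combination of several channels pulls the estimate toward the consistent region of the space.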