When interacting with robots, we display the plethora of affective reactions typical of natural communication. Indeed, emotions are embedded in our communication and constitute a predominant channel for conveying relevant, high-impact information. In recent years, a growing number of researchers have tried to exploit this channel for human-robot interaction (HRI) and human-computer interaction (HCI). Two key abilities are needed for this purpose: the ability to display emotions and the ability to automatically recognize them. In this work we present our system for the computer-based automatic recognition of emotions, together with new results obtained on a small dataset of quasi-unconstrained emotional videos extracted from TV series and movies. The results are encouraging, showing a recognition rate of about 74%.
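As a minimal illustration of the evaluation metric reported above, the sketch below computes a recognition rate (the fraction of clips whose predicted emotion matches the annotation) over a toy set of labelled clips. The emotion label set and the example predictions are illustrative assumptions, not the authors' actual system or data.

```python
# Hypothetical sketch: recognition rate (accuracy) of an emotion
# classifier over annotated video clips. Labels and predictions below
# are made up for illustration; they are not the paper's dataset.

def recognition_rate(predicted, actual):
    """Fraction of clips whose predicted emotion matches the annotation."""
    assert len(predicted) == len(actual) and actual
    correct = sum(p == a for p, a in zip(predicted, actual))
    return correct / len(actual)

# Toy example: 7 of 9 clips recognised correctly.
actual    = ["anger", "fear", "happiness", "sadness", "surprise",
             "anger", "happiness", "fear", "sadness"]
predicted = ["anger", "fear", "happiness", "sadness", "anger",
             "anger", "happiness", "surprise", "sadness"]

print(round(recognition_rate(predicted, actual), 3))  # 0.778
```

On a real evaluation one would typically also report a per-class confusion matrix, since emotion classes are rarely balanced in clips harvested from TV series and movies.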