In this paper we introduce a multi-modal database for the analysis of human interaction, in particular mimicry, and elaborate on the theoretical hypotheses concerning the relationship between the occurrence of mimicry and human affect. The recorded experiments are designed to explore this relationship. The corpus was recorded with 18 synchronised audio and video sensors, and is annotated for many different phenomena, including dialogue acts, turn-taking, affect, head gestures, hand gestures, body movement and facial expression. Recordings were made of two experiments: a discussion on a political topic, and a role-playing game. 40 participants were recruited, all of whom self-reported their felt experiences. The corpus will be made available to the scientific community.