DaFEx: Database of Facial Expressions
Proceedings of the First International Conference on Intelligent Technologies for Interactive Entertainment (INTETAIN '05)
In this paper we present DaFEx (Database of Facial Expressions), a database created to provide a benchmark for evaluating the facial expressivity of Embodied Conversational Agents (ECAs). DaFEx consists of 1008 short videos containing emotional facial expressions of Ekman's six basic emotions plus the neutral expression. The facial expressions were recorded by 8 professional actors (male and female) in two acting conditions ("utterance" and "no-utterance") and at 3 intensity levels (high, medium, low). The properties of DaFEx were studied by having 80 subjects classify the emotion expressed in each video. High accuracy rates were obtained for most of the emotions displayed. We also tested the effects of intensity level, of the articulatory movements due to speech, and of actor and subject gender on classification accuracy. The results showed that decoding accuracy decreases with the intensity of emotions; that the presence of articulatory movements hampers the recognition of fear, surprise, and the neutral expression, while it improves the recognition of anger; and that facial expressions seem to be recognized (slightly) better when performed by actresses than by actors.
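The abstract fully specifies the factorial design, so the arithmetic behind the 1008 clips can be made explicit. The Python sketch below is an illustration, not code from the paper: the three-takes-per-cell figure and the fully crossed layout (neutral included at all three intensities) are assumptions inferred from 1008 = 8 x 7 x 2 x 3 x 3, and per_emotion_accuracy is a hypothetical helper for tabulating judge responses of the kind collected from the 80 subjects.

    from itertools import product
    from collections import Counter

    # Design factors as stated in the abstract.
    EMOTIONS = ["anger", "disgust", "fear", "happiness",
                "sadness", "surprise", "neutral"]   # Ekman's six + neutral
    ACTORS = [f"actor_{i}" for i in range(1, 9)]    # 8 professional actors
    CONDITIONS = ["utterance", "no-utterance"]
    INTENSITIES = ["high", "medium", "low"]
    TAKES_PER_CELL = 3   # assumption: implied by 1008 / (8*7*2*3) = 3

    # Enumerate one metadata record per clip in a fully crossed design
    # (the crossing of neutral with intensity is an assumption).
    clips = [
        {"actor": a, "emotion": e, "condition": c, "intensity": i, "take": t}
        for a, e, c, i in product(ACTORS, EMOTIONS, CONDITIONS, INTENSITIES)
        for t in range(1, TAKES_PER_CELL + 1)
    ]
    assert len(clips) == 1008  # matches the database size in the abstract

    def per_emotion_accuracy(ratings):
        """ratings: iterable of (true_emotion, judged_emotion) pairs."""
        totals, hits = Counter(), Counter()
        for true, judged in ratings:
            totals[true] += 1
            hits[true] += (true == judged)
        return {e: hits[e] / totals[e] for e in totals}

    # Toy usage with made-up judgments:
    print(per_emotion_accuracy([("anger", "anger"),
                                ("fear", "surprise"),
                                ("fear", "fear")]))
    # -> {'anger': 1.0, 'fear': 0.5}

Grouping decoding accuracy by true emotion, as per_emotion_accuracy does, is what allows the per-factor comparisons the abstract reports (intensity, articulation condition, and actor/subject gender) to be read off the same rating data.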