A first evaluation study of a database of kinetic facial expressions (DaFEx)
ICMI '05 Proceedings of the 7th international conference on Multimodal interfaces
A systematic discussion of fusion techniques for multi-modal affect recognition tasks
ICMI '11 Proceedings of the 13th international conference on Multimodal interfaces
Hi4D-ADSIP 3-D dynamic facial articulation database
Image and Vision Computing
DaFEx (Database of Facial Expressions) is a database created to provide a benchmark for evaluating the facial expressivity of Embodied Conversational Agents (ECAs). DaFEx consists of 1008 short videos containing emotional facial expressions of the six Ekman emotions plus the neutral expression. The facial expressions were recorded by 8 Italian professional actors (4 male and 4 female) in two acting conditions (“utterance” and “no-utterance”) and at 3 intensity levels (high, medium, low). Great attention was paid to image quality and framing. The large number of videos, the number of variables considered, and the very good video quality make DaFEx a reference corpus both for the evaluation of ECAs and for research in emotion psychology.