Embodied conversational agents on a common ground
From brows to trust
A first evaluation study of a database of kinetic facial expressions (DaFEx)
ICMI '05 Proceedings of the 7th international conference on Multimodal interfaces
Fuzzy Similarity of Facial Expressions of Embodied Agents
IVA '07 Proceedings of the 7th international conference on Intelligent Virtual Agents
PRICAI'06 Proceedings of the 9th Pacific Rim international conference on Artificial intelligence
Affect expression in ECAs: Application to politeness displays
International Journal of Human-Computer Studies
Perception of blended emotions: from video corpus to expressive agent
IVA'06 Proceedings of the 6th international conference on Intelligent Virtual Agents
Perceiving visual emotions with speech
IVA'06 Proceedings of the 6th international conference on Intelligent Virtual Agents
The properties of DaFEx, a database of kinetic facial expressions
ACII'05 Proceedings of the First international conference on Affective Computing and Intelligent Interaction
Embodied Conversational Agents that can express emotions are a popular research topic. Yet, despite recent attempts, reliable methods for assessing the quality of facial displays are still lacking. This paper extends and refines the work in [6], focusing on the roles of the upper and lower portions of the face. We analysed the recognition rates and errors in the responses of 74 subjects to presentations of dynamic (human and synthetic) faces. The results point to: a) a possible way of addressing the issue of the naturalness of synthetic faces, and b) a greater importance of the upper part of the face.