User's gestural exploration of different virtual agents' expressive profiles. Proceedings of the 7th International Joint Conference on Autonomous Agents and Multiagent Systems (Volume 3).
Fuzzy similarity of facial expressions of embodied agents. IVA '07: Proceedings of the 7th International Conference on Intelligent Virtual Agents.
Model of facial expressions management for an embodied conversational agent. ACII '07: Proceedings of the 2nd International Conference on Affective Computing and Intelligent Interaction.
Facial expression synthesis using PAD emotional parameters for a Chinese expressive avatar. ACII '07: Proceedings of the 2nd International Conference on Affective Computing and Intelligent Interaction.
Simplified facial animation control utilizing novel input devices: a comparative study. Proceedings of the 14th International Conference on Intelligent User Interfaces.
Automatic design of a control interface for a synthetic face. Proceedings of the 14th International Conference on Intelligent User Interfaces.
Multimodal human machine interactions in virtual and augmented reality. Multimodal Signals: Cognitive and Algorithmic Issues.
Combining facial and postural expressions of emotions in a virtual character. IVA '09: Proceedings of the 9th International Conference on Intelligent Virtual Agents.
Affect expression in ECAs: application to politeness displays. International Journal of Human-Computer Studies.
Perception of blended emotions: from video corpus to expressive agent. IVA '06: Proceedings of the 6th International Conference on Intelligent Virtual Agents.
Intelligent expressions of emotions. ACII '05: Proceedings of the 1st International Conference on Affective Computing and Intelligent Interaction.
Computing emotion awareness through facial electromyography. ECCV '06: Proceedings of the 2006 International Conference on Computer Vision in Human-Computer Interaction.
We present an algorithm for generating facial expressions across a continuum of pure and mixed emotions of varying intensity. In natural interaction among humans, shades of emotion are encountered far more frequently than expressions of the basic emotions, so a method is required that can generate more than Ekman's six basic emotions (joy, anger, fear, sadness, disgust, and surprise). To this end, we have adapted the algorithm proposed by Tsapatsoulis et al. [1] to a physics-based facial animation system and a single, integrated emotion model. This facial animation system was combined with an equally flexible and expressive text-to-speech synthesis system, built on the same emotion model, to form a talking head capable of expressing non-basic emotions at varying intensities. We demonstrate the suitability of our approach with a variety of life-like intermediate facial expressions captured as snapshots from the system.