Designing affective user interfaces with expressive characters raises several questions. The system should be able to display facial expressions of complex emotions as dynamic, real-time reactions to the user's input. From a cognitive point of view, designers need to know how users will perceive the dynamics of these facial expressions as a function of their input. We aim to evaluate whether users can perceive different expressive profiles of a virtual character by manually controlling its expressions and observing its reactions to their input. This paper describes our platform, which enables a virtual character to display blended facial expressions of emotions as continuous, real-time reactions to the user's gesture input. We explain the techniques underlying the computation of intermediate facial expressions of emotion, and their control via gesture input in the three-dimensional PAD (Pleasure, Arousal, Dominance) space. Preliminary results of a perceptual study show the potential of this approach for assessing how the dynamics of emotional expressions are perceived during gesture interaction with virtual characters endowed with different expressive profiles.
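One common way to compute such intermediate expressions is to place a few anchor emotions at fixed PAD coordinates and blend their expression parameters according to the distance from the current PAD point. The sketch below illustrates this idea only; the anchor coordinates, the parameter vectors, and the inverse-distance weighting scheme are all illustrative assumptions, not the method described in the paper:

```python
import math

# Hypothetical anchor emotions at (pleasure, arousal, dominance) coordinates,
# each paired with a small illustrative expression-parameter vector
# (stand-ins for MPEG-4 FAP-like values).
ANCHORS = {
    "joy":     ((0.8, 0.5, 0.4),   [0.9, 0.7, 0.0]),
    "anger":   ((-0.6, 0.6, 0.3),  [0.1, 0.2, 0.9]),
    "sadness": ((-0.6, -0.4, -0.4),[0.0, 0.1, 0.2]),
    "relaxed": ((0.6, -0.5, 0.2),  [0.5, 0.3, 0.0]),
}

def blend_expression(pad, eps=1e-6):
    """Blend anchor expressions by inverse distance in PAD space.

    `pad` is the current (pleasure, arousal, dominance) point, e.g. driven
    in real time by the user's gesture input.
    """
    weights = {}
    for name, (coord, params) in ANCHORS.items():
        d = math.dist(pad, coord)
        if d < eps:            # exactly on an anchor: return its expression
            return list(params)
        weights[name] = 1.0 / d

    total = sum(weights.values())
    n_params = len(next(iter(ANCHORS.values()))[1])
    blended = [0.0] * n_params
    for name, (_, params) in ANCHORS.items():
        w = weights[name] / total   # normalized weight, all weights sum to 1
        for i, p in enumerate(params):
            blended[i] += w * p
    return blended
```

Moving the PAD point continuously then yields continuously varying blended parameters, which is the kind of smooth, real-time reaction to gesture input the abstract describes.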