Appraisal theories in psychology study facial expressions in order to deduce information about the underlying emotion elicitation processes. Scherer's component process model predicts particular facial muscle deformations as reactions to the cognitive appraisal of stimuli during an emotion episode. In the present work, MPEG-4 facial animation parameters are used to evaluate these theoretical predictions for the intermediate and final expressions of a given emotion episode. We manipulate parameters such as the intensity and temporal evolution of the synthesized facial expressions. For emotion episodes originating from identical stimuli, varying the cognitive appraisals of the stimuli and mapping them to different expression intensities and timings generates distinct behavioral patterns, and thus allows different agent character profiles to be defined. The results of the synthesis process are then applied to Embodied Conversational Agents (ECAs), with the aim of making their interaction with humans, or with other ECAs, more affective.
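The mapping described above, from appraisal outcomes to MPEG-4 facial animation parameter (FAP) values with per-profile intensity and timing, can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual implementation: the check names, the FAP amplitudes, and the `profile_intensity` / `profile_delay_ms` parameters are assumptions introduced here for clarity (FAP numbers 31/32 denote the inner-eyebrow raisers in the MPEG-4 standard).

```python
from dataclasses import dataclass

@dataclass
class FapKeyframe:
    time_ms: int        # onset time of this expression within the episode
    fap_values: dict    # FAP number -> amplitude (FAP units)

# Predicted facial reactions for two appraisal checks of the component
# process model (illustrative amplitudes, not calibrated predictions).
CHECK_PREDICTIONS = {
    "novelty":          {31: 120, 32: 120},   # inner eyebrows raised
    "goal_obstructive": {31: -80, 32: -80},   # frown: brows lowered
}

def synthesize_sequence(checks, profile_intensity=1.0, profile_delay_ms=0):
    """Map a sequence of appraisal check outcomes to timed FAP keyframes.

    Varying profile_intensity / profile_delay_ms for the same stimulus
    yields different behavioral patterns, i.e. agent character profiles.
    """
    frames = []
    t = 0
    for check in checks:
        faps = {fap: round(v * profile_intensity)
                for fap, v in CHECK_PREDICTIONS[check].items()}
        frames.append(FapKeyframe(time_ms=t + profile_delay_ms, fap_values=faps))
        t += 200  # appraisal checks unfold sequentially in the model
    return frames

# Same stimulus, two hypothetical character profiles:
calm = synthesize_sequence(["novelty", "goal_obstructive"], 0.5, 100)
volatile = synthesize_sequence(["novelty", "goal_obstructive"], 1.5, 0)
```

Under this sketch, the "calm" profile produces weaker, delayed eyebrow movements for the identical stimulus sequence, while the "volatile" profile reacts immediately and with amplified deformations.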