Comparing and evaluating real time character engines for virtual environments. Presence: Teleoperators and Virtual Environments.
Interpretation of emotional body language displayed by robots. Proceedings of the 3rd International Workshop on Affective Interaction in Natural Environments.
Emotional body language displayed by artificial agents. ACM Transactions on Interactive Intelligent Systems (TiiS), Special Issue on Affective Interaction in Natural Environments.
Children's interpretation of emotional body language displayed by a robot. ICSR'11: Proceedings of the Third International Conference on Social Robotics.
Humans use their bodies in a highly expressive way during conversation, and animated characters that lack this form of non-verbal expression can seem stiff and unemotional. An important aspect of non-verbal expression is that people respond to each other's behavior and are highly attuned to picking up on these responses. This is particularly important for the feedback given while listening to someone speak. However, automatically generating this type of behavior is difficult because it is highly complex and subtle. This paper takes a data-driven approach to generating interactive social behavior. Listening behavior is motion captured together with the audio being listened to, and these data are used to learn an animation model of one person's responses to the other. This allows us to create characters that respond in real time during a conversation with a real human. Copyright 2008 John Wiley & Sons, Ltd.
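As a rough illustration of the pipeline the abstract describes (captured audio paired with captured listener motion, then real-time response generation), the following is a minimal sketch only. The abstract does not specify the learning method, features, or pose representation; the nearest-neighbor lookup, the `ListenerModel` class, and all feature and pose names here are hypothetical stand-ins.

```python
import numpy as np

class ListenerModel:
    """Hypothetical sketch of a data-driven listening-behavior model.

    Training data is assumed to be pairs of (speaker audio feature window,
    motion-captured listener pose). At runtime, the pose whose paired audio
    window is closest to the live input is replayed. This is an assumption
    for illustration, not the method used in the paper.
    """

    def __init__(self, audio_windows, listener_poses):
        # audio_windows: (N, d) speaker audio features, e.g. energy and pitch
        # listener_poses: (N, k) captured listener joint angles
        self.audio = np.asarray(audio_windows, dtype=float)
        self.poses = np.asarray(listener_poses, dtype=float)

    def respond(self, audio_window):
        # Nearest-neighbor lookup over the captured audio windows.
        q = np.asarray(audio_window, dtype=float)
        dists = np.linalg.norm(self.audio - q, axis=1)
        return self.poses[int(np.argmin(dists))]

# Toy data: two audio states (quiet vs. emphatic speech) paired with a
# neutral listener pose and a head-nod pose (yaw, pitch, roll).
audio = [[0.1, 0.2], [0.9, 0.8]]
poses = [[0.0, 0.0, 0.0], [0.0, -0.3, 0.0]]
model = ListenerModel(audio, poses)
print(model.respond([0.85, 0.75]))  # near the emphatic window, so nod pose
```

In a real system this lookup would run once per audio frame, so the listener animates continuously while the speaker talks; a learned regression or sequence model would replace the table lookup to produce smooth, novel motion rather than replayed frames.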