Copying behaviour of expressive motion
MIRAGE '07: Proceedings of the 3rd International Conference on Computer Vision/Computer Graphics Collaboration Techniques
This paper presents a system that acquires input from a video camera, processes information about the expressivity of human movement, and generates expressive copying behaviour for an embodied agent. We model bi-directional communication between user and agent based on real-time analysis of movement expressivity and generation of expressive copying behaviour: while the user moves, the agent responds with a gesture that exhibits the same expressive characteristics. A perceptual evaluation study with participants showed the effectiveness of the designed interaction.
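The pipeline the abstract describes (track the user's movement, extract expressive qualities, and replay the agent's own gesture with those qualities) can be sketched in a few functions. This is a minimal illustrative sketch, not the paper's implementation: the cue names (`quantity_of_motion`, spatial/temporal extent), the scale factors, and the parameter mapping are all assumptions chosen for illustration.

```python
# Hypothetical sketch of expressivity-based copying behaviour.
# Input: a sequence of tracked (x, y) positions of, e.g., the user's hand,
# sampled from video frames. All thresholds and names are illustrative.

def quantity_of_motion(positions):
    """Total displacement of a tracked point across consecutive frames."""
    return sum(
        ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
        for (x1, y1), (x2, y2) in zip(positions, positions[1:])
    )

def expressivity_params(positions, fps=25):
    """Map raw motion cues to normalized expressivity parameters in [0, 1]."""
    qom = quantity_of_motion(positions)
    duration = max(len(positions) - 1, 1) / fps   # seconds of movement
    speed = qom / duration                        # average speed (px/s)
    xs = [p[0] for p in positions]
    ys = [p[1] for p in positions]
    extent = max(max(xs) - min(xs), max(ys) - min(ys))  # bounding-box size (px)
    return {
        # scale factors (500 px/s, 300 px) are arbitrary illustrative choices
        "temporal_extent": min(speed / 500.0, 1.0),
        "spatial_extent": min(extent / 300.0, 1.0),
    }

def copying_gesture(params):
    """The agent replays a gesture of its own repertoire, modulated so that
    its amplitude and speed mirror the user's expressive qualities."""
    return {
        "amplitude": params["spatial_extent"],
        "speed": params["temporal_extent"],
    }
```

A slow, small movement yields low amplitude and speed for the agent's response, while a fast, wide movement yields values near 1, so the copying gesture exhibits the same expressive character as the user's motion.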