International Journal of Human-Computer Studies - Application of affective computing in human-computer interaction
We present an evaluation of copying behaviour in an embodied agent capable of processing the expressivity characteristics of a user's movement and conveying aspects of them in real time. The agent responds to affective cues in gestures performed by actors, producing synthesised gestures that exhibit similar expressive qualities. Copying is thus performed only at the expressive level: information about other aspects of the gesture, such as its shape, is not retained. This research is relevant to social interaction between agents and humans, for example, where an agent wishes to show empathy with a conversational partner without exactly copying their motions.
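The idea of expressive-level copying can be illustrated with a small sketch: extract coarse expressivity cues (here, spatial extent and movement energy) from a user's gesture trajectory, then apply them as synthesis parameters to the agent's own gesture template, discarding the gesture's shape. All function names, features, and formulas below are illustrative assumptions, not the system's actual implementation.

```python
# Hypothetical sketch of expressive-level copying. The agent keeps its
# own gesture shape but adopts expressive qualities measured from the
# user's movement. Feature definitions here are simple stand-ins.

def expressivity_features(trajectory, dt=1.0 / 30):
    """Compute coarse expressivity cues from a list of (x, y) samples."""
    xs = [p[0] for p in trajectory]
    ys = [p[1] for p in trajectory]
    # Spatial extent: overall size of the region swept by the gesture.
    extent = (max(xs) - min(xs)) + (max(ys) - min(ys))
    # Energy: mean squared speed across successive samples.
    speeds = [
        ((trajectory[i + 1][0] - trajectory[i][0]) ** 2
         + (trajectory[i + 1][1] - trajectory[i][1]) ** 2) ** 0.5 / dt
        for i in range(len(trajectory) - 1)
    ]
    energy = sum(s * s for s in speeds) / len(speeds)
    return {"extent": extent, "energy": energy}


def copy_expressivity(user_trajectory, agent_gesture_template):
    """Retain the agent's gesture shape; adopt the user's expressivity."""
    feats = expressivity_features(user_trajectory)
    return {
        **agent_gesture_template,          # shape is the agent's own
        "amplitude_scale": feats["extent"],
        "speed_scale": feats["energy"] ** 0.5,
    }
```

In this sketch only the scalar expressivity parameters cross from user to agent, mirroring the paper's point that copying happens at the expressive level rather than at the level of gesture form.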