Implementing expressive gesture synthesis for embodied conversational agents
GW'05 Proceedings of the 6th international conference on Gesture in Human-Computer Interaction and Simulation
Proceedings of the 8th ACM/IEEE international conference on Human-robot interaction
This work aims to establish more varied, and thus more engaging, communication with humanoids. To this end, we developed a method for dynamically expressing emotions and intentions through parametrised gestures. First, we analysed the gestures and the parameters needed to generate a whole-body gesture set. On this basis we define a gesture's inner and outer expressivity, which distinguishes single gestures from variable gestural expressions. Gestures are then classified into (feedback) categories and related to one another according to their expressivity. The resulting gesture set comprises six categories with over 50 variable gestures. As a proof of concept, the gesture set has been implemented to support an easy and flexible authoring process for gesture-supported communication.
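To illustrate the idea of a category-organised set of parametrised gestures, here is a minimal sketch in Python. All names, categories, and parameter ranges (`amplitude`, `speed`, the "affirm" category) are illustrative assumptions, not taken from the paper; the intent is only to show how one gesture entry can yield many variable expressions by sampling its expressivity parameters.

```python
import random
from dataclasses import dataclass

@dataclass
class Gesture:
    """A single gesture with illustrative outer-expressivity parameter ranges."""
    name: str
    category: str                     # hypothetical feedback category, e.g. "affirm"
    amplitude: tuple = (0.2, 1.0)     # assumed range for movement amplitude
    speed: tuple = (0.5, 1.5)         # assumed range for movement speed

    def instantiate(self, intensity: float) -> dict:
        """Map an intensity in [0, 1] onto concrete motion parameters."""
        lo_a, hi_a = self.amplitude
        lo_s, hi_s = self.speed
        return {
            "gesture": self.name,
            "amplitude": lo_a + intensity * (hi_a - lo_a),
            "speed": lo_s + intensity * (hi_s - lo_s),
        }

class GestureSet:
    """Gestures grouped by category, so one intent can select varied expressions."""
    def __init__(self):
        self.by_category = {}

    def add(self, gesture: Gesture):
        self.by_category.setdefault(gesture.category, []).append(gesture)

    def express(self, category: str, intensity: float, rng=random) -> dict:
        """Pick a gesture from the category and vary it by the given intensity."""
        gesture = rng.choice(self.by_category[category])
        return gesture.instantiate(intensity)

gestures = GestureSet()
gestures.add(Gesture("nod", "affirm"))
gestures.add(Gesture("thumbs_up", "affirm"))
print(gestures.express("affirm", 0.8))
```

With such a structure, an authoring tool only needs to name a category and an intensity; the surface realisation (which gesture, how large, how fast) is filled in from the parametrised set, which is the flexibility the abstract describes.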