Deictic and emotive communication in animated pedagogical agents
Embodied conversational agents
Performative facial expressions in animated faces
Embodied conversational agents
Coordination and context-dependence in the generation of embodied conversation
INLG '00: Proceedings of the First International Conference on Natural Language Generation, Volume 14
User-defined gestures for surface computing
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
3D gesture recognition: an evaluation of user and system performance
Pervasive '11: Proceedings of the 9th International Conference on Pervasive Computing
On factoring out a gesture typology from the Bielefeld speech-and-gesture-alignment corpus (SAGA)
GW '09: Proceedings of the 8th International Conference on Gesture in Embodied Communication and Human-Computer Interaction
Dynamic gesture vocabulary design for intuitive human-robot dialog
HRI '12: Proceedings of the Seventh Annual ACM/IEEE International Conference on Human-Robot Interaction
To move or to remove?: a human-centric approach to understanding gesture interpretation
Proceedings of the Designing Interactive Systems Conference
A typology of gesture is presented based on four parameters: whether the gesture necessarily occurs with the verbal signal, whether it is represented in memory or created anew, how arbitrary or motivated it is, and what type of meaning it conveys. On the second parameter, gestures divide into codified gestures, which are stored in memory, and creative gestures, which are produced on the spot by applying a set of generative rules. On the basis of this typology, a procedure is presented for generating the different types of gestures in a Multimodal Embodied Agent.
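The four-parameter typology described in the abstract lends itself to a simple data model. The sketch below is one illustrative way to encode the parameters in Python; the enum names and their values (in particular the meaning-type categories) are assumptions for illustration, not terminology taken from the paper itself.

```python
from dataclasses import dataclass
from enum import Enum

# Parameter 1: does the gesture necessarily co-occur with the verbal signal?
class Cooccurrence(Enum):
    REQUIRES_SPEECH = "requires_speech"
    AUTONOMOUS = "autonomous"

# Parameter 2: is the gesture stored in memory or created on the spot?
class Origin(Enum):
    CODIFIED = "codified"   # represented in memory (a gesture lexicon)
    CREATIVE = "creative"   # generated anew by applying generative rules

# Parameter 3: how arbitrary or motivated is the form-meaning link?
class Motivation(Enum):
    ARBITRARY = "arbitrary"
    MOTIVATED = "motivated"

# Parameter 4: what type of meaning is conveyed?
# (These category names are hypothetical placeholders.)
class MeaningType(Enum):
    REFERENTIAL = "referential"
    DEICTIC = "deictic"
    PERFORMATIVE = "performative"

@dataclass
class Gesture:
    name: str
    cooccurrence: Cooccurrence
    origin: Origin
    motivation: Motivation
    meaning: MeaningType

    def is_codified(self) -> bool:
        """True when the gesture is retrieved from memory
        rather than generated by rules."""
        return self.origin is Origin.CODIFIED

# Example classification of a conventionalized emblem.
thumbs_up = Gesture(
    name="thumbs-up",
    cooccurrence=Cooccurrence.AUTONOMOUS,
    origin=Origin.CODIFIED,
    motivation=Motivation.ARBITRARY,
    meaning=MeaningType.PERFORMATIVE,
)
print(thumbs_up.is_codified())  # True
```

A generation procedure along the lines the abstract mentions could then branch on `origin`: look up codified gestures in a lexicon, and invoke generative rules for creative ones.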