Reactiva'Motion Project: Motion Synthesis Based on a Reactive Representation
GW '99 Proceedings of the International Gesture Workshop on Gesture-Based Communication in Human-Computer Interaction
This paper describes a complete system for the specification and generation of communication gestures. A high-level language for specifying hand-arm communication gestures has been developed. The language is based both on a discrete description of space and on a movement decomposition inspired by sign language gestures. Communication gestures are represented as symbolic commands that can be described with qualitative data and translated into spatio-temporal targets driving a generation system. This approach is applicable to the class of generation models controlled through key-point information. The generation model used in our approach is composed of a set of sensorimotor servo-loops. Each loop resolves its inversion in real time from the direct specification of location targets, while satisfying psycho-motor laws of biological movement. The complete control system is applied to the synthesis of communication and sign language gestures, and a validation of the synthesized movements is presented.
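The pipeline the abstract describes (symbolic command → spatio-temporal target → real-time servo-loop generation) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the zone names, the `GestureCommand` structure, and the use of a minimum-jerk profile (one common model of the psycho-motor laws of biological movement) are all assumptions for the example.

```python
from dataclasses import dataclass

# Hypothetical discretization of signing space: symbolic zone names
# mapped to 3D locations (coordinates are illustrative).
ZONES = {
    "chest": (0.0, 0.3, 0.2),
    "head":  (0.0, 0.6, 0.1),
    "side":  (0.4, 0.3, 0.0),
}

@dataclass
class GestureCommand:
    """Symbolic command: reach a named zone within a given duration."""
    zone: str
    duration: float  # seconds

def to_target(cmd: GestureCommand):
    """Translate a symbolic command into a spatio-temporal target."""
    return ZONES[cmd.zone], cmd.duration

def servo_loop(start, cmd: GestureCommand, dt=0.01):
    """Drive the effector from `start` toward the target location.

    The trajectory follows a minimum-jerk time profile, a standard
    stand-in for biological-movement laws (an assumption here).
    """
    target, duration = to_target(cmd)
    steps = int(duration / dt)
    trajectory = [tuple(start)]
    for i in range(1, steps + 1):
        t = i / steps
        # Minimum-jerk interpolation: s(t) = 10t^3 - 15t^4 + 6t^5
        s = 10 * t**3 - 15 * t**4 + 6 * t**5
        pos = tuple(a + s * (b - a) for a, b in zip(start, target))
        trajectory.append(pos)
    return trajectory

# Example: move the hand from rest to the chest zone in 0.5 s.
traj = servo_loop((0.0, 0.0, 0.0), GestureCommand("chest", 0.5))
```

The key design point mirrored from the abstract is the separation of concerns: the symbolic layer knows only qualitative zone names, while the generation layer resolves them into continuous trajectories in real time.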