A Complete System for the Specification and the Generation of Sign Language Gestures

  • Authors:
  • Thierry Lebourque; Sylvie Gibet

  • Venue:
  • GW '99 Proceedings of the International Gesture Workshop on Gesture-Based Communication in Human-Computer Interaction
  • Year:
  • 1999

Abstract

This paper describes a system called GeSsyCa that produces synthetic sign language gestures from a high-level specification. This specification is expressed in a language based both on a discrete description of space and on a movement decomposition inspired by sign language gestures. Communication gestures are represented through symbolic commands that can be described by qualitative data and translated into spatio-temporal targets driving a generation system. Such an approach is possible for the class of generation models controlled through key-point information. The generation model used in our approach is composed of a set of sensori-motor servo-loops. Each of these servo-loops resolves its inversion in real time from the direct specification of location targets, while satisfying the psycho-motor laws of biological movement. The whole control system is applied to the synthesis of communication and sign language gestures, and a validation of the synthesized movements is presented.
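
The sketch below illustrates the general target-driven control idea the abstract describes: symbolic commands are mapped to spatial targets, and a servo-loop drives an end-effector toward each target in turn. It is a minimal sketch only; the second-order critically damped dynamics, the command names, the coordinates, and the gain value are illustrative assumptions and are not the paper's actual specification language or sensori-motor model.

```python
import numpy as np

# Hypothetical symbolic commands mapped to spatial targets. The coordinates
# stand in for the paper's discrete description of space, which is not
# reproduced here.
SYMBOLIC_TARGETS = {
    "HAND_AT_CHEST":    np.array([0.00, 0.15, 0.30]),
    "HAND_AT_SHOULDER": np.array([0.20, 0.40, 0.25]),
    "HAND_FORWARD":     np.array([0.10, 0.25, 0.60]),
}

def servo_loop(commands, dt=0.01, steps_per_target=120, omega=8.0):
    """Drive an end-effector toward successive spatio-temporal targets with a
    critically damped second-order loop (an illustrative stand-in for the
    paper's sensori-motor servo-loops)."""
    pos = np.zeros(3)
    vel = np.zeros(3)
    trajectory = []
    for name in commands:
        goal = SYMBOLIC_TARGETS[name]
        for _ in range(steps_per_target):
            # Acceleration pulls the effector toward the current target
            # while the damping term smooths out the velocity profile.
            acc = omega**2 * (goal - pos) - 2.0 * omega * vel
            vel += acc * dt
            pos += vel * dt
            trajectory.append(pos.copy())
    return np.array(trajectory)

if __name__ == "__main__":
    traj = servo_loop(["HAND_AT_CHEST", "HAND_AT_SHOULDER", "HAND_FORWARD"])
    print(traj.shape)          # (360, 3) sampled end-effector positions
    print(traj[-1].round(3))   # converges near the last target
```

In this toy formulation each target is reached with a smooth, bell-shaped velocity profile, which is the kind of kinematic regularity the abstract refers to as psycho-motor laws of biological movement; the actual laws and the inversion scheme used by GeSsyCa are described in the paper itself.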