Deciphering gestures with layered meanings and signer adaptation

  • Authors:
  • Sylvie C. W. Ong; Surendra Ranganath

  • Affiliations:
  • Dept of Electrical and Computer Engineering, National University of Singapore, Singapore (both authors)

  • Venue:
  • FGR '04: Proceedings of the Sixth IEEE International Conference on Automatic Face and Gesture Recognition
  • Year:
  • 2004

Abstract

Grammatical information conveyed through systematic temporal and spatial movement modifications is an integral aspect of sign language communication. We propose to model these systematic variations as simultaneous channels of information. Classification results at the channel level are output to Bayesian networks, which recognize both the basic gesture meaning and the grammatical information (here referred to as layered meanings). With a simulated vocabulary of 6 basic signs and 5 possible layered meanings, test data from eight test subjects was recognized with 85.0% accuracy. We also adapt a system trained on three test subjects to recognize gesture data from a fourth person, based on a small set of adaptation data. We obtained a gesture recognition accuracy of 88.5%, which is a 75.7% reduction in error rate compared to the unadapted system.
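The channel-combination idea in the abstract can be illustrated with a minimal sketch. This is not the authors' model: the sign vocabulary, channel names, and probabilities below are all hypothetical, and the Bayesian network is reduced to a naive-Bayes-style product of per-channel likelihoods over joint (sign, layered meaning) hypotheses.

```python
# Illustrative sketch only: fusing per-channel classifier scores to
# jointly infer a basic sign and its layered (grammatical) meaning.
# Vocabularies, channels, and numbers are invented for demonstration.
from itertools import product

signs = ["GIVE", "LOOK"]          # hypothetical basic signs
layers = ["neutral", "emphatic"]  # hypothetical layered meanings

# Per-channel likelihoods P(channel observation | sign, layer),
# e.g. produced by low-level classifiers over temporal and spatial
# movement cues for one observed gesture.
channel_scores = {
    "movement": {("GIVE", "neutral"): 0.6, ("GIVE", "emphatic"): 0.2,
                 ("LOOK", "neutral"): 0.1, ("LOOK", "emphatic"): 0.1},
    "spatial":  {("GIVE", "neutral"): 0.5, ("GIVE", "emphatic"): 0.3,
                 ("LOOK", "neutral"): 0.1, ("LOOK", "emphatic"): 0.1},
}

def joint_posterior(scores):
    """Multiply channel likelihoods (channels treated as independent
    given sign and layer) and normalise over all (sign, layer) pairs."""
    unnorm = {}
    for s, l in product(signs, layers):
        p = 1.0
        for ch in scores:
            p *= scores[ch][(s, l)]
        unnorm[(s, l)] = p
    z = sum(unnorm.values())
    return {k: v / z for k, v in unnorm.items()}

post = joint_posterior(channel_scores)
best = max(post, key=post.get)
print(best)  # most probable (sign, layered meaning) pair
```

With the numbers above, the movement and spatial channels both favour GIVE in its neutral form, so the fused posterior picks ("GIVE", "neutral"); a full Bayesian network would additionally model dependencies between channels and priors over signs and layers.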