Influence of handshape information on automatic sign language recognition

  • Authors:
  • Gineke A. ten Holt, Marcel J. T. Reinders, Emile A. Hendriks, Huib de Ridder, Andrea J. van Doorn

  • Affiliations:
  • Information and Communication Theory Group, Delft University of Technology, Delft, The Netherlands (ten Holt, Reinders, Hendriks); Department of Industrial Design, Delft University of Technology, Delft, The Netherlands (de Ridder, van Doorn)

  • Venue:
  • GW'09 Proceedings of the 8th international conference on Gesture in Embodied Communication and Human-Computer Interaction
  • Year:
  • 2009

Abstract

Research on automatic sign language recognition (ASLR) has mostly been conducted from a machine learning perspective. We propose to incorporate results from human sign recognition studies into ASLR. A previous study found that handshape is important for human sign recognition. The current paper describes the implementation of this finding: using handshape information in ASLR. Handshape information in three different representations is added to an existing ASLR system. The results show that recognition improves for all but one representation, which refutes the idea that extra (handshape) information will always improve recognition. Results also vary per sign: some sign classifiers improve greatly, others are unaffected, and in rare cases performance even decreases. Adapting classifiers to specific sign types could be key for future ASLR.
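
The sketch below illustrates the general idea of adding a handshape representation to an existing per-sign recognizer; it is not the paper's actual system. The feature dimensions, the mean-pooling over frames, and the SVM classifier are illustrative assumptions, since the abstract does not specify the classifiers or feature definitions used.

```python
# Minimal sketch (assumptions, not the paper's method): append handshape
# features to existing motion features and train one binary classifier per sign.
import numpy as np
from sklearn.svm import SVC

def combine_features(motion_feats, handshape_feats):
    """Concatenate motion features (n x d_m) with one handshape
    representation (n x d_h), example by example."""
    return np.hstack([motion_feats, handshape_feats])

# Toy data: 40 sign examples, 20 frames each.
rng = np.random.default_rng(0)
n_examples, n_frames, d_motion, d_handshape = 40, 20, 12, 6
motion = rng.normal(size=(n_examples, n_frames, d_motion))
handshape = rng.normal(size=(n_examples, n_frames, d_handshape))
labels = rng.integers(0, 2, size=n_examples)  # 1 = target sign, 0 = other signs

# Mean-pool over time so a fixed-length vector feeds a standard classifier.
X = combine_features(motion.mean(axis=1), handshape.mean(axis=1))

# One binary classifier per sign (only one target sign shown here).
clf = SVC(kernel="rbf").fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```

Comparing such per-sign classifiers trained with and without the handshape columns would mirror the kind of per-sign analysis the abstract reports: some signs gain, some are unaffected, and a few may lose accuracy.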