Taiwan sign language (TSL) recognition based on 3D data and neural networks

  • Authors:
  • Yung-Hui Lee; Cheng-Yueh Tsai

  • Affiliations:
  • Department of Industrial Management, National Taiwan University of Science and Technology, #43, Section 4, Keelung Road, Taipei 106, Taiwan, ROC

  • Venue:
  • Expert Systems with Applications: An International Journal
  • Year:
  • 2009


Abstract

This study proposes a system for recognizing static gestures in Taiwan Sign Language (TSL) using 3D data and neural networks. The 3D hand gesture data are acquired with a VICON motion-capture system, then processed and converted into features that are fed to a neural network. The extracted features are invariant to occlusion, rotation, scaling, and translation of the hand. Experimental results indicate that the proposed system can recognize 20 static hand gestures in Taiwan Sign Language with an average accuracy of 96.58%. Additionally, the difference in recognition rate between the training set and the test set is 3.98%, showing that the proposed system is robust.
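The abstract does not specify the feature definitions or network architecture. The sketch below is a minimal illustration, under assumptions not taken from the paper, of how features invariant to translation, rotation, and scaling can be derived from 3D marker data (here via normalized pairwise distances) and classified with a small neural network. All data, marker counts, and parameters are hypothetical.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier


def invariant_features(markers):
    """Convert an (N, 3) array of 3D hand-marker positions into a feature
    vector unchanged by translation, rotation, and scaling of the hand.

    Pairwise Euclidean distances are already translation- and rotation-
    invariant; dividing by the largest distance removes scale.
    (Illustrative choice only; not the features used in the paper.)
    """
    diffs = markers[:, None, :] - markers[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    upper = np.triu_indices(len(markers), k=1)  # unique marker pairs
    feats = dists[upper]
    return feats / feats.max()                  # scale normalization


# Hypothetical toy data: 200 samples, 15 markers per hand, 20 gesture classes.
rng = np.random.default_rng(0)
X = np.stack([invariant_features(rng.normal(size=(15, 3))) for _ in range(200)])
y = rng.integers(0, 20, size=200)

# A small multilayer perceptron stands in for the paper's (unspecified) network.
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000, random_state=0)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```

With real motion-capture recordings, the same pipeline would be evaluated on a held-out test set rather than on the training data, which is how the paper reports the 3.98% train/test gap.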