Linguistic properties based on American Sign Language recognition with artificial neural networks using a sensory glove and motion tracker

  • Authors:
  • Cemil Oz; Ming C. Leu

  • Affiliations:
  • Department of Computer Engineering, University of Sakarya, Turkey; Department of Mechanical and Aerospace Engineering, University of Missouri-Rolla

  • Venue:
  • IWANN'05: Proceedings of the 8th International Conference on Artificial Neural Networks: Computational Intelligence and Bioinspired Systems
  • Year:
  • 2005

Abstract

Sign language, which is a highly visual-spatial, linguistically complete and natural language, is the main mode of communication among deaf people. In this paper, an American Sign Language (ASL) word recognition system is developed using artificial neural networks (ANNs) to translate ASL words into English. The system uses a Cyberglove™ sensory glove and a Flock of Birds® 3-D motion tracker to extract gesture features. The finger joint angle data obtained from strain gauges in the sensory glove define the hand shape, while the data from the tracker describe the trajectory of the hand movement. The hand trajectory is normalized to allow more flexibility in the signer's position. The data from these devices are processed by two neural networks: a velocity network and a word recognition network. The velocity network uses hand speed to determine the duration of words. To convey the meaning of a sign, signs are defined by feature vectors such as hand shape, hand location, orientation, movement, bounding box, and distance. The second network is used as a classifier that converts ASL signs into words based on these features. We trained and tested our ANN model on 60 ASL words with different numbers of samples. Our test results show a recognition accuracy of 92%.
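
The abstract outlines a two-stage pipeline: glove and tracker readings are turned into a fixed-length feature vector (hand shape, normalized trajectory, bounding box), a velocity stage isolates the word in time, and a feedforward network classifies it into one of 60 words. The sketch below is only an illustration of that idea, not the authors' implementation; the dimensions (18 joint angles, 60 classes), the one-hidden-layer topology, the speed threshold, and all function names are assumptions.

```python
# Minimal sketch of the feature-extraction + classification idea described in
# the abstract. Sizes, topology, and thresholds are assumed for illustration.
import numpy as np

def normalize_trajectory(traj):
    """Shift a (T, 3) hand trajectory so it is relative to its starting point
    and scale it by its bounding-box diagonal, making the features independent
    of where the signer stands (the 'position flexibility' mentioned above)."""
    traj = np.asarray(traj, dtype=float)
    traj = traj - traj[0]                        # translate start to origin
    span = traj.max(axis=0) - traj.min(axis=0)   # bounding box of the movement
    scale = float(np.linalg.norm(span)) or 1.0
    return traj / scale, span / scale            # normalized path + bounding box

def moving_frames(traj, low_speed=0.05):
    """Crude stand-in for the 'velocity network': flag frames where the hand
    moves faster than a threshold, so the word's duration can be delimited."""
    speed = np.linalg.norm(np.diff(traj, axis=0), axis=1)
    return speed > low_speed                     # boolean mask over frames

class WordRecognizer:
    """One-hidden-layer feedforward classifier (an assumed topology) mapping a
    fixed-length feature vector to one of n_words ASL word classes."""
    def __init__(self, n_features, n_hidden, n_words, rng=np.random.default_rng(0)):
        self.W1 = rng.normal(0.0, 0.1, (n_features, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, n_words))
        self.b2 = np.zeros(n_words)

    def forward(self, x):
        h = np.tanh(x @ self.W1 + self.b1)       # hidden-layer activations
        z = h @ self.W2 + self.b2
        e = np.exp(z - z.max())
        return e / e.sum()                       # softmax over word classes

# Example feature vector: joint angles plus simple normalized-trajectory stats.
angles = np.random.rand(18)                      # placeholder glove reading
path, bbox = normalize_trajectory(np.random.rand(40, 3))
features = np.concatenate([angles, path.mean(axis=0), bbox])
model = WordRecognizer(n_features=features.size, n_hidden=32, n_words=60)
print(model.forward(features).argmax())          # index of the predicted word
```

In the paper the duration estimation and the word classification are handled by separate trained networks; the threshold-based segmentation and the untrained classifier above merely show where those two stages sit in the data flow.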