Recognition of gestures in Arabic sign language using neuro-fuzzy systems

  • Authors:
  • Omar Al-Jarrah; Alaa Halawani

  • Affiliations:
  • Jordan Univ. of Science and Technology, Irbid, Jordan; Jordan Univ. of Science and Technology, Irbid, Jordan

  • Venue:
  • Artificial Intelligence
  • Year:
  • 2001

Abstract

Hand gestures play an important role in everyday human communication, but their most extensive use as a means of communication is found in sign languages. Sign language is the primary communication method among deaf people, and a translator is usually needed when a hearing person wants to communicate with a deaf one. The work presented in this paper aims at developing a system for automatic translation of the gestures of the manual alphabet in Arabic sign language. To this end, we have designed a collection of ANFIS (adaptive neuro-fuzzy inference system) networks, each of which is trained to recognize one gesture. Our system does not rely on gloves or visual markings to accomplish the recognition task; instead, it operates on images of bare hands, which allows the user to interact with the system in a natural way. An image of the hand gesture is processed and converted into a set of features comprising the lengths of a number of vectors selected to span the fingertip region. The extracted features are rotation, scale, and translation invariant, which makes the system more flexible. The subtractive clustering algorithm and the least-squares estimator are used to identify the fuzzy inference system, and training is performed with the hybrid learning algorithm. Experiments showed that the system recognizes the 30 Arabic manual alphabets with an accuracy of 93.55%.
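The subtractive clustering step the abstract mentions (used to identify the initial fuzzy inference system) can be sketched as below. This is a minimal, simplified version of Chiu's subtractive clustering with a single stopping threshold rather than the full accept/reject criteria; the radii `ra`, `rb` and the threshold `eps` are illustrative defaults, not values reported in the paper.

```python
import numpy as np

def subtractive_clustering(X, ra=0.5, rb=None, eps=0.15):
    """Simplified subtractive clustering (Chiu, 1994).

    X   : (n_samples, n_features) data, ideally normalized to [0, 1].
    ra  : neighborhood radius defining a cluster's influence.
    rb  : radius for suppressing potentials near an accepted center
          (conventionally 1.5 * ra to keep centers well separated).
    eps : stop when the best remaining potential falls below
          eps * (first center's potential).
    """
    if rb is None:
        rb = 1.5 * ra
    alpha = 4.0 / ra ** 2
    beta = 4.0 / rb ** 2

    # Pairwise squared distances between all data points.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)

    # Initial potential of each point: a density measure over its neighbors.
    P = np.exp(-alpha * d2).sum(axis=1)
    P_first = P.max()

    centers = []
    while True:
        k = int(P.argmax())
        if P[k] < eps * P_first:
            break
        centers.append(X[k])
        # Subtract the chosen center's influence from all remaining potentials,
        # so nearby points cannot be selected as redundant centers.
        P = P - P[k] * np.exp(-beta * d2[:, k])
    return np.array(centers)
```

Each center found this way typically seeds one fuzzy rule (a Gaussian membership function per input dimension); the rule's linear consequent parameters are then fitted by least squares, matching the FIS-identification procedure the abstract describes.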