IMPROVING GESTURE RECOGNITION IN THE ARABIC SIGN LANGUAGE USING TEXTURE ANALYSIS

  • Authors:
  • Omar Al-Jarrah
  • Faruq A. Al-Omari

  • Affiliations:
  • Department of Computer Engineering, Jordan University of Science and Technology, Irbid, Jordan
  • Department of Computer Engineering, Yarmouk University, Irbid, Jordan

  • Venue:
  • Applied Artificial Intelligence
  • Year:
  • 2007


Abstract

Sign language plays a crucial role in communication with people whom the voice cannot reach: deaf people use it as their primary means of communication, with hand gestures representing the alphabets of sign languages. Proper communication between hearing and deaf people therefore requires a translator. In this paper, a fully automated translator of the gestures representing the alphabets of the Arabic Sign Language (ASL) was developed. A set of 30 ANFIS networks was designed and trained to recognize the ASL gestures. The developed system is vision-based and does not rely on gloves or visual markings; it operates on images of bare hands, allowing the user to interact with the system in a natural way. A twin approach based on boundary and region properties is used to extract a feature set that characterizes each gesture. The extracted features are invariant to translation, scaling, and rotation, which makes the system more flexible. The subtractive clustering algorithm and the least-squares estimator are used to identify the fuzzy inference system, and training is performed with the hybrid learning algorithm. Experiments showed that the system recognizes the 30 Arabic manual alphabets with a recognition rate of 100% when approximately 19 rules are used per ANFIS model, and 97.5% when approximately 10 rules are used.
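The abstract names subtractive clustering as the technique used to identify the fuzzy inference system. As a minimal illustrative sketch (not the authors' implementation; the radii `ra`, `rb` and the stopping threshold `eps` are assumed defaults chosen for illustration), Chiu-style subtractive clustering treats every data point as a candidate cluster center, repeatedly picks the point with the highest density "potential", and then suppresses the potential of its neighbors:

```python
import numpy as np

def subtractive_clustering(X, ra=1.0, rb=None, eps=0.15):
    """Pick cluster centers from the data points themselves by repeatedly
    selecting the point with the highest 'potential' (a density measure)
    and suppressing the potentials of points near each chosen center."""
    if rb is None:
        rb = 1.5 * ra                 # squash radius, conventionally 1.5 * ra
    alpha, beta = 4.0 / ra**2, 4.0 / rb**2
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    P = np.exp(-alpha * d2).sum(axis=1)                  # initial potentials
    centers, p_first = [], P.max()
    while P.max() >= eps * p_first:   # stop when the best remaining potential is small
        i = int(P.argmax())
        centers.append(X[i])
        P = P - P[i] * np.exp(-beta * d2[i])  # suppress neighbors of the new center
    return np.array(centers)

# Two tight, well-separated point clusters -> two cluster centers expected.
A = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1], [0.1, 0.1]])
X = np.vstack([A, A + 10.0])
centers = subtractive_clustering(X, ra=2.0)
```

Each resulting center would seed one fuzzy rule in an ANFIS model, which is why the cluster radius controls the rule count: a larger radius merges more points into each cluster and yields fewer rules, consistent with the paper's trade-off between roughly 19 rules (100% recognition) and roughly 10 rules (97.5%).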