Improved Face and Hand Tracking for Sign Language Recognition

  • Authors:
  • N. Soontranon; S. Aramvith; T. H. Chalidabhongse

  • Affiliations:
  • Chulalongkorn University, Thailand; Chulalongkorn University, Thailand; King Mongkut's Institute of Technology Ladkrabang, Thailand

  • Venue:
  • ITCC '05: Proceedings of the International Conference on Information Technology: Coding and Computing (ITCC'05) - Volume II
  • Year:
  • 2005


Abstract

In this paper, we develop face and hand tracking for a sign language recognition system. The system is divided into two stages: an initial stage and a tracking stage. In the initial stage, we use skin features to localize the signer's face and hands. An ellipse model in CbCr space is constructed and used to detect skin color. After the skin regions have been segmented, face and hand blobs are identified using size and facial features, under the assumption that the face moves less than the hands in this signing scenario. In the tracking stage, motion estimation is applied only to the hand blobs, using first and second derivatives to predict the hand positions. We observed errors in the tracked position between two consecutive frames when the velocity changes abruptly. To improve tracking performance, our proposed algorithm compensates for these errors by using an adaptive search area to re-compute the hand blobs. Simulation results indicate that the proposed algorithm tracks the face and hands with greater precision at a negligible increase in computational complexity.
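The two core computations the abstract describes — the ellipse skin-color test in CbCr space and the derivative-based position prediction — can be sketched roughly as below. This is an illustrative approximation, not the authors' implementation: the ellipse parameters (center, semi-axes, rotation) and the weighting of the acceleration term are hypothetical, since the abstract does not specify them, and the adaptive search-area correction step is omitted.

```python
import numpy as np

# Hypothetical ellipse parameters in CbCr space; the paper's fitted
# values are not given in the abstract, so these are illustrative only.
CENTER = np.array([110.0, 152.0])  # (Cb, Cr) center of the skin ellipse
AXES = np.array([25.0, 14.0])      # semi-axes along the principal directions
THETA = np.deg2rad(2.5)            # rotation of the ellipse (assumed)

def is_skin(cb, cr):
    """Return True if a (Cb, Cr) pair falls inside the ellipse model."""
    c, s = np.cos(THETA), np.sin(THETA)
    # Rotate the point into the ellipse's principal-axis frame.
    x = c * (cb - CENTER[0]) + s * (cr - CENTER[1])
    y = -s * (cb - CENTER[0]) + c * (cr - CENTER[1])
    return (x / AXES[0]) ** 2 + (y / AXES[1]) ** 2 <= 1.0

def predict_position(positions):
    """Predict the next hand-blob position from the last three observed
    positions using finite first and second differences (velocity and
    acceleration). The 0.5 acceleration weight is an assumption."""
    p = np.asarray(positions, dtype=float)
    v = p[-1] - p[-2]                        # first derivative (velocity)
    a = (p[-1] - p[-2]) - (p[-2] - p[-3])    # second derivative (acceleration)
    return p[-1] + v + 0.5 * a
```

Under constant velocity the acceleration term vanishes and the prediction is a simple linear extrapolation; when the velocity changes between frames, the second-derivative term adjusts the predicted position, and the (omitted) adaptive search area would then correct any remaining error by re-detecting the hand blob around the prediction.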