Hand detection and feature extraction for static Thai Sign Language recognition

  • Authors:
  • Thanadej Suksil; Thanarat H. Chalidabhongse

  • Affiliations:
  • Chulalongkorn University, Bangkok, Thailand (both authors)

  • Venue:
  • Proceedings of the 7th International Conference on Ubiquitous Information Management and Communication
  • Year:
  • 2013

Abstract

This paper proposes a method to detect and extract hand features from video sequences in which a person performs Thai Sign Language (TSL), for recognizing static TSL alphabets. First, skin regions are segmented using a trained skin color model represented in the YCbCr color space. Next, Haar-like features are used to locate the initial positions of the face and hands for subsequent tracking. During tracking, object hypotheses and template matching are employed to follow the face and hands even when occlusions occur. The motion and shape of the hands are used to determine the gesture state and to extract sign key frames. To recognize TSL alphabets, the hand postures are first classified into groups using the convexity defect points of the hand shape. Then, the Hu moments of the hand shape are matched within each group using a K-nearest-neighbor classifier. Results are shown for several videos of a professional TSL interpreter signing 42 TSL alphabets.
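The first stage, skin segmentation in YCbCr space, can be sketched as follows. This is an illustrative reconstruction, not the paper's code: the conversion uses the standard ITU-R BT.601 formulas, and the fixed Cb/Cr ranges below are common heuristic defaults standing in for the trained skin color model described in the abstract.

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    """Convert an (H, W, 3) uint8 RGB image to YCbCr (ITU-R BT.601)."""
    rgb = rgb.astype(np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return np.stack([y, cb, cr], axis=-1)

def skin_mask(rgb, cb_range=(77, 127), cr_range=(133, 173)):
    """Return a boolean mask of likely skin pixels.

    Luma (Y) is ignored so the rule is less sensitive to illumination;
    the Cb/Cr ranges are illustrative defaults, not the trained model
    from the paper.
    """
    ycbcr = rgb_to_ycbcr(rgb)
    cb, cr = ycbcr[..., 1], ycbcr[..., 2]
    return ((cb >= cb_range[0]) & (cb <= cb_range[1]) &
            (cr >= cr_range[0]) & (cr <= cr_range[1]))

# A skin-toned pixel falls inside the Cb/Cr box; pure blue does not.
img = np.array([[[224, 172, 138], [0, 0, 255]]], dtype=np.uint8)
mask = skin_mask(img)
```

In practice the resulting mask would be cleaned with morphological operations before connected skin regions are handed to the face/hand detector.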
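The final matching stage can likewise be sketched in a few lines: compute the seven Hu invariant moments of a binary hand mask, then match against stored templates with a nearest-neighbor rule. This is a minimal stand-in for the paper's classifier; the convexity-defect grouping step is omitted, and the `hu_moments` and `knn_label` helpers are hypothetical names introduced here for illustration.

```python
import numpy as np

def hu_moments(mask):
    """Seven Hu invariant moments of a 2-D boolean mask
    (translation-, scale-, and rotation-invariant shape features)."""
    ys, xs = np.nonzero(mask)
    m00 = float(len(xs))
    xbar, ybar = xs.mean(), ys.mean()
    def eta(p, q):  # normalized central moment
        mu = (((xs - xbar) ** p) * ((ys - ybar) ** q)).sum()
        return mu / m00 ** (1 + (p + q) / 2.0)
    n20, n02, n11 = eta(2, 0), eta(0, 2), eta(1, 1)
    n30, n03, n21, n12 = eta(3, 0), eta(0, 3), eta(2, 1), eta(1, 2)
    h1 = n20 + n02
    h2 = (n20 - n02) ** 2 + 4 * n11 ** 2
    h3 = (n30 - 3 * n12) ** 2 + (3 * n21 - n03) ** 2
    h4 = (n30 + n12) ** 2 + (n21 + n03) ** 2
    h5 = ((n30 - 3 * n12) * (n30 + n12)
          * ((n30 + n12) ** 2 - 3 * (n21 + n03) ** 2)
          + (3 * n21 - n03) * (n21 + n03)
          * (3 * (n30 + n12) ** 2 - (n21 + n03) ** 2))
    h6 = ((n20 - n02) * ((n30 + n12) ** 2 - (n21 + n03) ** 2)
          + 4 * n11 * (n30 + n12) * (n21 + n03))
    h7 = ((3 * n21 - n03) * (n30 + n12)
          * ((n30 + n12) ** 2 - 3 * (n21 + n03) ** 2)
          - (n30 - 3 * n12) * (n21 + n03)
          * (3 * (n30 + n12) ** 2 - (n21 + n03) ** 2))
    return np.array([h1, h2, h3, h4, h5, h6, h7])

def knn_label(query, templates, labels, k=1):
    """K-nearest-neighbor vote over stored Hu-moment templates."""
    d = np.linalg.norm(templates - query, axis=1)
    nearest = np.argsort(d)[:k]
    vals, counts = np.unique(labels[nearest], return_counts=True)
    return vals[np.argmax(counts)]

# Rotation invariance: a rectangle and its 90-degree rotation match.
rect = np.zeros((16, 16), dtype=bool)
rect[4:8, 2:14] = True
hu_a = hu_moments(rect)
hu_b = hu_moments(np.rot90(rect))
```

A usage example: with `templates = np.stack([hu_a, np.zeros(7)])` and `labels = np.array(["rect", "other"])`, `knn_label(hu_a, templates, labels)` returns `"rect"`. In the paper's setting the templates would be Hu-moment vectors of key-frame hand shapes, and matching would be restricted to the group selected by the convexity-defect step.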