Australian sign language recognition

  • Authors:
  • Eun-Jung Holden
  • Gareth Lee
  • Robyn Owens

  • Affiliations:
  • School of Computer Science & Software Engineering, The University of Western Australia, 35 Stirling Highway, 6009, Crawley, WA, Australia
  • School of Engineering Science, Murdoch University, 35 Stirling Highway, 6168, Rockingham, WA, Australia
  • School of Computer Science & Software Engineering, The University of Western Australia, 35 Stirling Highway, 6009, Crawley, WA, Australia

  • Venue:
  • Machine Vision and Applications
  • Year:
  • 2005

Abstract

This paper presents an automatic Australian sign language (Auslan) recognition system, which tracks multiple target objects (the face and hands) throughout an image sequence and extracts features for the recognition of sign phrases. Tracking is performed by matching simple geometric features of the target objects between the current and previous frames. In signing, the face and a hand of the signer often overlap, so the system must segment them before features can be extracted. Our system handles the occlusion of the face and a hand by detecting the contour of the moving foreground object using a combination of motion cues and the snake algorithm. Signs are represented by features that are invariant to scaling, 2D rotation, and signing speed. These features encode the relative geometric positions and shapes of the target objects, as well as their directions of motion, and are used to recognise Auslan phrases with Hidden Markov Models. Experiments were conducted on 163 test sign phrases with varying grammatical formations. Using a known grammar, the system achieved a recognition rate of over 97% at the sentence level and a success rate of 99% at the word level.
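
The HMM-based recognition stage can be illustrated with a short sketch. The fragment below assumes per-frame feature vectors have already been extracted; it trains one Gaussian HMM per phrase using the third-party hmmlearn library and classifies a new sequence by maximum log-likelihood. The library choice, state count, and function names are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch of HMM-based sign phrase classification, assuming per-frame
# feature vectors have already been extracted (the paper's features encode the
# relative geometry and shapes of the face and hands plus motion direction).
# hmmlearn's GaussianHMM is used as a stand-in; the paper does not name a
# library, and the model sizes below are illustrative.
import numpy as np
from hmmlearn import hmm


def train_phrase_models(training_data, n_states=5):
    """Train one HMM per sign phrase.

    training_data: dict mapping phrase label -> list of feature sequences,
                   each sequence an array of shape (n_frames, n_features).
    """
    models = {}
    for label, sequences in training_data.items():
        X = np.concatenate(sequences)                # stack all frames
        lengths = [len(seq) for seq in sequences]    # per-sequence frame counts
        model = hmm.GaussianHMM(n_components=n_states,
                                covariance_type="diag",
                                n_iter=100)
        model.fit(X, lengths)
        models[label] = model
    return models


def classify_phrase(models, sequence):
    """Return the phrase whose HMM assigns the observed sequence
    the highest log-likelihood."""
    scores = {label: m.score(sequence) for label, m in models.items()}
    return max(scores, key=scores.get)
```

In the paper's setting, the observation vectors fed to such models would be the scale-, rotation-, and speed-invariant geometry, shape, and motion-direction features described above.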