Understanding hand gestures using approximate graph matching

  • Authors:
  • B. W. Miners;O. A. Basir;M. S. Kamel

  • Affiliations:
  • Dept. of Syst. Design Eng., Univ. of Waterloo, Ont., Canada

  • Venue:
  • IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans
  • Year:
  • 2005


Abstract

We live in a society that depends on high-tech devices for assistance with everyday tasks, from transportation to health care, communication, and entertainment. Tedious tactile input interfaces to these devices waste our time. Appropriate use of natural hand gestures enables more efficient communication, provided their underlying meaning is understood. Overcoming the challenges of natural hand gesture understanding is therefore vital as these devices become increasingly pervasive in our everyday lives. This work presents a graph-based approach to understanding the meaning of hand gestures by associating dynamic hand gestures with known concepts and relevant knowledge. Conceptual-level processing is emphasized to robustly handle noise and ambiguity introduced during generation, data acquisition, and low-level recognition. A simple recognition stage helps relax the scalability limitations of conventional stochastic language models. Experimental results show that this graph-based approach successfully understands the meaning of ambiguous phrases consisting of three to five hand gestures. The presented approximate graph-matching technique supports practical and efficient communication of complex intent to the increasingly pervasive high-tech devices in our society.
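To illustrate the general idea of approximate graph matching described in the abstract, the sketch below shows a toy matcher that maps a noisy observed gesture graph onto stored concept graphs and returns the best-scoring concept. This is not the authors' algorithm; the graph representation, scoring weights, gesture labels, and the brute-force search are all illustrative assumptions made here for clarity.

```python
from itertools import permutations

# Illustrative sketch only (not the paper's method).
# Graphs are dicts: {"nodes": {id: label}, "edges": {(id_a, id_b): relation}}.

def match_score(observed, concept, mapping):
    """Score one candidate mapping of observed nodes onto concept nodes."""
    score = 0.0
    for o_id, c_id in mapping.items():
        if observed["nodes"][o_id] == concept["nodes"][c_id]:
            score += 1.0          # reward a matching gesture/concept label
    for (a, b), rel in observed["edges"].items():
        if a in mapping and b in mapping:
            if concept["edges"].get((mapping[a], mapping[b])) == rel:
                score += 0.5      # reward a matching relation between mapped nodes
    return score

def best_match(observed, concepts):
    """Return the concept graph whose best node mapping scores highest."""
    best_name, best_score = None, float("-inf")
    o_ids = list(observed["nodes"])
    for name, concept in concepts.items():
        c_ids = list(concept["nodes"])
        if len(c_ids) < len(o_ids):
            continue              # cannot embed the observation in a smaller graph
        for perm in permutations(c_ids, len(o_ids)):
            mapping = dict(zip(o_ids, perm))
            s = match_score(observed, concept, mapping)
            if s > best_score:
                best_name, best_score = name, s
    return best_name, best_score

# Hypothetical example: an observed three-gesture phrase compared
# against two stored concept graphs.
observed = {
    "nodes": {0: "point", 1: "grasp", 2: "move"},
    "edges": {(0, 1): "then", (1, 2): "then"},
}
concepts = {
    "pick-and-place": {
        "nodes": {0: "point", 1: "grasp", 2: "move", 3: "release"},
        "edges": {(0, 1): "then", (1, 2): "then", (2, 3): "then"},
    },
    "wave-hello": {
        "nodes": {0: "raise", 1: "wave"},
        "edges": {(0, 1): "then"},
    },
}

print(best_match(observed, concepts))  # -> ('pick-and-place', 4.0)
```

Even this toy version shows why the matching is "approximate": the observed phrase is only a partial, possibly noisy subgraph of the stored concept, and the matcher tolerates missing nodes and relations by scoring the best available mapping rather than requiring an exact isomorphism.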