Georgia Tech Gesture Toolkit: supporting experiments in gesture recognition

  • Authors:
  • Tracy Westeyn, Helene Brashear, Amin Atrash, Thad Starner

  • Affiliations:
  • Georgia Institute of Technology, Atlanta, GA (all authors)

  • Venue:
  • Proceedings of the 5th International Conference on Multimodal Interfaces
  • Year:
  • 2003

Abstract

Gesture recognition is becoming a more common interaction tool in the fields of ubiquitous and wearable computing. Designing a system to perform gesture recognition, however, can be a cumbersome task. Hidden Markov models (HMMs), a pattern recognition technique commonly used in speech recognition, can be used for recognizing certain classes of gestures. Existing HMM toolkits for speech recognition can be adapted to perform gesture recognition, but doing so requires significant knowledge of the speech recognition literature and its relation to gesture recognition. This paper introduces the Georgia Tech Gesture Toolkit (GT2k), which leverages Cambridge University's speech recognition toolkit, HTK, to provide tools that support gesture recognition research. GT2k provides capabilities for training models and allows for both real-time and off-line recognition. This paper presents four ongoing projects that utilize the toolkit in a variety of domains.
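The abstract describes the general approach GT2k supports: train one HMM per gesture class on example feature sequences, then classify a new sequence by the model that assigns it the highest likelihood. The sketch below is not GT2k's API (GT2k wraps HTK); it is a minimal illustration of that train/recognize pattern, assuming the third-party hmmlearn library and hypothetical 3-axis accelerometer features as the gesture data.

```python
# Minimal sketch of HMM-based gesture classification (not GT2k's actual API).
# Assumes the hmmlearn library; the 3-axis accelerometer features and the
# gesture labels ("wave", "shake") are hypothetical stand-ins.
import numpy as np
from hmmlearn import hmm

def train_gesture_models(examples_by_class, n_states=5):
    """Train one Gaussian-emission HMM per gesture class."""
    models = {}
    for label, sequences in examples_by_class.items():
        # hmmlearn expects all training sequences stacked, plus their lengths.
        X = np.vstack(sequences)
        lengths = [len(s) for s in sequences]
        model = hmm.GaussianHMM(n_components=n_states, covariance_type="diag",
                                n_iter=50, random_state=0)
        model.fit(X, lengths)
        models[label] = model
    return models

def classify(models, sequence):
    """Return the gesture label whose HMM gives the highest log-likelihood."""
    return max(models, key=lambda label: models[label].score(sequence))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic stand-in data: 10 training sequences per gesture, 30 frames each.
    data = {
        "wave":  [rng.normal(0.0, 1.0, size=(30, 3)) for _ in range(10)],
        "shake": [rng.normal(2.0, 1.0, size=(30, 3)) for _ in range(10)],
    }
    models = train_gesture_models(data)
    print(classify(models, rng.normal(2.0, 1.0, size=(30, 3))))  # likely "shake"
```

In a toolkit like GT2k the same pipeline is driven by grammar and model-definition files rather than hand-written training loops, and recognition can run either off-line on recorded data or on a live sensor stream.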