GART: the gesture and activity recognition toolkit

  • Authors:
  • Kent Lyons, Helene Brashear, Tracy Westeyn, Jung Soo Kim, Thad Starner

  • Affiliations:
  • College of Computing and GVU Center, Georgia Institute of Technology, Atlanta, GA (all authors)

  • Venue:
  • HCI'07 Proceedings of the 12th international conference on Human-computer interaction: intelligent multimodal interaction environments
  • Year:
  • 2007

Abstract

The Gesture and Activity Recognition Toolkit (GART) is a user interface toolkit designed to enable the development of gesture-based applications. GART provides an abstraction over machine learning algorithms suitable for modeling and recognizing different types of gestures. The toolkit also provides support for data collection and the training process. In this paper, we present GART and its machine learning abstractions. Furthermore, we detail the components of the toolkit and present two example gesture recognition applications.
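
The abstract describes a workflow of data collection, training, and recognition behind a machine learning abstraction. The Java sketch below illustrates that general workflow only; every class and method name here is hypothetical and invented for illustration, and the toy classifier merely stands in for whatever learning algorithm such a toolkit would wrap. It is not GART's actual API.

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class GestureToolkitSketch {

    /** A labeled sequence of sensor readings (e.g. accelerometer frames). */
    static class GestureSample {
        final String label;
        final List<double[]> frames = new ArrayList<>();
        GestureSample(String label) { this.label = label; }
    }

    /** Hypothetical stand-in for a toolkit's machine learning abstraction. */
    interface GestureModel {
        void train(List<GestureSample> samples);
        String recognize(GestureSample sample);
    }

    /** Trivial classifier that compares sample lengths; used only to keep the
     *  sketch runnable. A real toolkit would hide an HMM or similar model
     *  behind this interface. */
    static class MeanLengthModel implements GestureModel {
        private final Map<String, Double> meanLength = new HashMap<>();

        public void train(List<GestureSample> samples) {
            Map<String, List<Integer>> lengths = new HashMap<>();
            for (GestureSample s : samples)
                lengths.computeIfAbsent(s.label, k -> new ArrayList<>()).add(s.frames.size());
            for (Map.Entry<String, List<Integer>> e : lengths.entrySet())
                meanLength.put(e.getKey(),
                        e.getValue().stream().mapToInt(Integer::intValue).average().orElse(0));
        }

        public String recognize(GestureSample sample) {
            String best = null;
            double bestDist = Double.MAX_VALUE;
            for (Map.Entry<String, Double> e : meanLength.entrySet()) {
                double d = Math.abs(e.getValue() - sample.frames.size());
                if (d < bestDist) { bestDist = d; best = e.getKey(); }
            }
            return best;
        }
    }

    public static void main(String[] args) {
        // 1. Data collection: gather labeled samples from a sensor.
        List<GestureSample> training = new ArrayList<>();
        training.add(makeSample("circle", 30));
        training.add(makeSample("circle", 34));
        training.add(makeSample("shake", 12));
        training.add(makeSample("shake", 10));

        // 2. Training: the application never touches the learning algorithm directly.
        GestureModel model = new MeanLengthModel();
        model.train(training);

        // 3. Recognition: classify a new, unlabeled sample.
        GestureSample unknown = makeSample("?", 32);
        System.out.println("Recognized gesture: " + model.recognize(unknown));
    }

    /** Builds a synthetic sample with the given number of random frames. */
    static GestureSample makeSample(String label, int frames) {
        GestureSample s = new GestureSample(label);
        for (int i = 0; i < frames; i++)
            s.frames.add(new double[] { Math.random(), Math.random(), Math.random() });
        return s;
    }
}

The point of the sketch is the separation of concerns the abstract implies: the application code only collects samples and calls train and recognize, while the choice and configuration of the underlying learning algorithm stays inside the toolkit.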