3-D Real-Time Gesture Recognition Using Proximity Spaces

  • Authors: Eric Huber

  • Venue: WACV '96 — Proceedings of the 3rd IEEE Workshop on Applications of Computer Vision

  • Year: 1996


Abstract

A 3-D segment tracking approach to recognition of human pose and gestures is presented. In the past we have developed and refined a stereo-based method, called the Proximity Space method, for acquiring and maintaining track of object surfaces in 3-space. This method uses LoG-filtered images and relies solely on stereo measurements to spatially distinguish between objects in 3D. The objective of this work is to obtain useful state information about the shape, size, and pose of natural (unadorned) objects in naturally cluttered environments. Thus, our system neither requires nor benefits from special markers, colors, or other tailored artifacts. Recently we have extended this method to track multiple regions and segments of complex objects. This paper describes techniques for applying the Proximity Space method to a particularly interesting system: the human. Specifically, we discuss the use of simple models for constraining Proximity Space behavior in order to track gestures as a person moves through a cluttered environment. It is demonstrated that by observing the behavior of the model, used to track the human's pose through time, different gestures can be easily recognized. The approach is illustrated through a discussion of gestures used to provide logical and spatial commands to a mobile robot.
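The paper itself gives no code, but the abstract notes that the Proximity Space method operates on LoG (Laplacian of Gaussian) filtered images before stereo matching. As a hedged illustration only — not the authors' implementation — the LoG filtering step can be sketched with SciPy's `gaussian_laplace` on a synthetic image; the image contents and `sigma` value here are arbitrary assumptions:

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

# Synthetic stand-in for one camera image: a bright square "object"
# on a dark background (purely illustrative data).
img = np.zeros((64, 64), dtype=float)
img[28:36, 28:36] = 1.0

# LoG filtering: Gaussian smoothing followed by the Laplacian,
# which responds strongly at blob and edge structure. sigma=2.0
# is an assumed scale, not a value from the paper.
log_response = gaussian_laplace(img, sigma=2.0)

# The filtered image keeps the input's shape; edges of the square
# produce response lobes of both signs.
print(log_response.shape)
```

In a stereo pipeline like the one the abstract describes, the left and right images would each be filtered this way, and the resulting responses matched to recover 3-D surface measurements.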