DigitEyes: Vision-Based Human Hand Tracking

  • Authors: James M. Rehg; Takeo Kanade

  • Affiliations: -;-

  • Venue: DigitEyes: Vision-Based Human Hand Tracking
  • Year: 1993

Abstract

Passive sensing of human hand and limb motion is important for a wide range of applications, from human-computer interaction to athletic performance measurement. High degree-of-freedom articulated mechanisms such as the human hand are difficult to track because of their large state space and complex image appearance. This article describes a model-based hand tracking system, called DigitEyes, that can recover the state of a 27 DOF hand model from grayscale images at speeds of up to 10 Hz. We employ kinematic and geometric hand models, along with a high temporal sampling rate, to decompose global image patterns into incremental, local motions of simple shapes. Hand pose and joint angles are estimated from line and point features extracted from images of unmarked, unadorned hands, taken from one or more viewpoints. We present preliminary results on a 3D mouse interface based on the DigitEyes sensor.
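
The tracking scheme described in the abstract (a kinematic/geometric hand model whose state is refined frame by frame so that predicted line and point features match measured ones) can be illustrated with a small, hypothetical sketch. The code below is not the authors' implementation: the feature-prediction function `predict`, the 27-element state vector `q`, and the damping and iteration parameters are all assumptions, and the measured feature vector is assumed to come from an external feature extractor. It shows one damped Gauss-Newton update per frame; the high temporal sampling rate mentioned in the abstract is what keeps the previous state close enough to the new solution for such a local, incremental correction to be effective.

```python
# Hypothetical sketch of an incremental, model-based articulated tracker update.
# Not the DigitEyes source code; `predict` and the state layout are assumptions.

import numpy as np

def numeric_jacobian(predict, q, eps=1e-5):
    """Finite-difference Jacobian of the feature-prediction function at state q."""
    f0 = predict(q)
    J = np.zeros((f0.size, q.size))
    for i in range(q.size):
        dq = np.zeros_like(q)
        dq[i] = eps
        J[:, i] = (predict(q + dq) - f0) / eps
    return J

def track_frame(q, measured, predict, damping=1e-3, iters=5):
    """Refine the hand state q (e.g. 27 DOF: palm pose plus finger joint angles)
    so that model-predicted features match those measured in the current image.

    predict(q) -> flat array of predicted feature parameters (e.g. projected
    fingertip points and finger-link line parameters); `measured` is the
    corresponding array extracted from the image. A few damped Gauss-Newton
    steps suffice when the previous frame's state is a good starting point.
    """
    q = np.asarray(q, dtype=float)
    for _ in range(iters):
        r = measured - predict(q)                # feature residual
        J = numeric_jacobian(predict, q)         # sensitivity of features to state
        H = J.T @ J + damping * np.eye(q.size)   # damped normal equations
        q = q + np.linalg.solve(H, J.T @ r)      # incremental state correction
    return q
```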