Modeling the constraints of human hand motion

  • Authors:
  • John Lin; Ying Wu; T. S. Huang

  • Venue:
  • HUMO '00 Proceedings of the Workshop on Human Motion (HUMO'00)
  • Year:
  • 2000

Abstract

Hand motion capture is one of the most important components of gesture interfaces. Many current approaches to this task involve a formidable nonlinear optimization problem in a large search space. Motion capture can be performed more efficiently by exploiting the motion constraints of the hand. Although some constraints can be represented as equalities or inequalities, many constraints cannot be explicitly represented. In this paper, we propose a learning approach that models the hand configuration space directly. The redundancy of the configuration space can be eliminated by finding a lower-dimensional subspace of the original space. Finger motion is modeled in this subspace based on the linear behavior observed in real motion data collected with a CyberGlove. Employing the constrained motion model, we are able to efficiently capture finger motion from video inputs. Several experiments show that our proposed model is effective for capturing articulated motion.
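
The abstract's core idea, eliminating redundancy by projecting joint-angle measurements (e.g., from a CyberGlove) onto a lower-dimensional linear subspace, can be illustrated with a short sketch. The snippet below is not the authors' code: it uses synthetic joint-angle data and plain PCA via SVD, and the joint count (20), the number of latent degrees of freedom, and the 95% variance threshold are assumptions chosen purely for illustration.

```python
import numpy as np

# Hypothetical stand-in for CyberGlove recordings: 20 joint angles per frame
# (the exact sensor count is an assumption, not taken from the paper).
rng = np.random.default_rng(0)
n_frames, n_joints = 5000, 20

# Synthetic data: a few latent degrees of freedom drive all joints,
# mimicking the strong coupling between finger joints in real hand motion.
latent = rng.uniform(0.0, 1.0, size=(n_frames, 7))      # assumed latent DOFs
coupling = rng.normal(size=(7, n_joints))                # assumed linear coupling
angles = latent @ coupling + 0.01 * rng.normal(size=(n_frames, n_joints))

# Centre the data and find a lower-dimensional linear subspace via PCA (SVD).
mean = angles.mean(axis=0)
U, S, Vt = np.linalg.svd(angles - mean, full_matrices=False)
explained = (S ** 2) / np.sum(S ** 2)

# Keep enough components to cover ~95% of the variance (threshold is an assumption).
k = int(np.searchsorted(np.cumsum(explained), 0.95) + 1)
basis = Vt[:k]                                           # k x n_joints projection basis

# Project a pose into the subspace and reconstruct it, discarding the redundancy.
pose = angles[0]
coords = (pose - mean) @ basis.T                         # low-dimensional coordinates
reconstruction = coords @ basis + mean

print(f"kept {k} of {n_joints} dimensions, "
      f"reconstruction error {np.linalg.norm(pose - reconstruction):.4f}")
```

In a tracking setting, a search over the few subspace coordinates (rather than all joint angles) is what makes capturing finger motion from video tractable; the sketch only shows the subspace construction step.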