EigenTracking: Robust Matching and Tracking of Articulated Objects Using a View-Based Representation

  • Authors:
  • Michael J. Black; Allan D. Jepson

  • Affiliations:
  • Xerox Palo Alto Research Center, 3333 Coyote Hill Road, Palo Alto, CA 94304. E-mail: black@parc.xerox.com
  • Department of Computer Science, University of Toronto, Toronto, Ontario M5S 3H5, Canada. E-mail: jepson@vis.toronto.edu

  • Venue:
  • International Journal of Computer Vision
  • Year:
  • 1998


Abstract

This paper describes an approach for tracking rigid and articulated objects using a view-based representation. The approach builds on and extends work on eigenspace representations, robust estimation techniques, and parameterized optical flow estimation. First, we note that the least-squares image reconstruction of standard eigenspace techniques has a number of problems, and we reformulate the reconstruction problem as one of robust estimation. Second, we define a “subspace constancy assumption” that allows us to exploit techniques for parameterized optical flow estimation to simultaneously solve for the view of an object and the affine transformation between the eigenspace and the image. To account for large affine transformations between the eigenspace and the image, we define a multi-scale eigenspace representation and a coarse-to-fine matching strategy. Finally, we use these techniques to track objects over long image sequences in which the objects simultaneously undergo both affine image motions and changes of view. In particular, we use this “EigenTracking” technique to track and recognize the gestures of a moving hand.
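The robust reformulation of eigenspace reconstruction mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the paper's full method: it estimates only the subspace coefficients (no affine transformation, no multi-scale matching), uses a Geman–McClure-style robust error with a fixed scale `sigma`, and solves it by iteratively reweighted least squares rather than the continuation/annealing scheme the paper employs. The function name and parameters are illustrative choices, not from the paper.

```python
import numpy as np

def robust_subspace_coeffs(image, basis, sigma=0.5, n_iters=50):
    """Estimate subspace coefficients c so that basis @ c matches image
    under a robust error norm, instead of plain least squares.

    Uses the Geman-McClure norm rho(e) = e^2 / (e^2 + sigma^2), whose
    influence on large residuals (e.g. occluded pixels) falls off to zero.
    Minimized here by iteratively reweighted least squares (IRLS) with
    weights w(e) = 2*sigma^2 / (e^2 + sigma^2)^2.
    """
    x = np.asarray(image, dtype=float).ravel()
    B = np.asarray(basis, dtype=float).reshape(x.size, -1)

    # Standard (non-robust) eigenspace projection as the starting point.
    c, *_ = np.linalg.lstsq(B, x, rcond=None)

    for _ in range(n_iters):
        e = x - B @ c                                  # per-pixel residuals
        w = 2 * sigma**2 / (e**2 + sigma**2) ** 2      # Geman-McClure weights
        Bw = B * w[:, None]                            # weighted basis rows
        c = np.linalg.solve(B.T @ Bw, Bw.T @ x)        # weighted normal equations
    return c
```

On an image whose pixels mostly lie in the subspace but contain a few gross outliers, the downweighting keeps the outliers from corrupting the recovered coefficients the way a least-squares projection would.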