Articulated-body tracking through anisotropic edge detection

  • Authors:
  • David Knossow; Joost van de Weijer; Radu Horaud; Rémi Ronfard

  • Affiliations:
  • INRIA Rhône-Alpes, Montbonnot, France (all authors)

  • Venue:
  • WDV'05/WDV'06/ICCV'05/ECCV'06: Proceedings of the 2005/2006 International Conference on Dynamical Vision
  • Year:
  • 2006

Abstract

This paper addresses the problem of articulated motion tracking from image sequences. We describe a method that relies both on an explicit parameterization of the extremal contours and on the prediction of the human boundary edges in the image. We combine extremal contour prediction and edge detection in a nonlinear minimization process. The error function, which measures the discrepancy between observed image edges and predicted model contours, is minimized using an analytical expression of the Jacobian that maps joint velocities onto extremal contour velocities. In practice, we model people both by their geometry (truncated elliptic cones) and by their articulated structure: a kinematic model with 40 rotational degrees of freedom. To overcome the flaws of standard edge detection, we introduce a model-based anisotropic Gaussian filter whose parameters are automatically derived from the kinematic model through the prediction of the extremal contours. The theory is validated by performing full-body, markerless motion capture from six synchronized video sequences at 30 fps.
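
The two ingredients the abstract describes can be illustrated with a minimal Python sketch: an anisotropic derivative-of-Gaussian kernel oriented along a predicted contour direction, and a damped Gauss-Newton update on the joint angles using a Jacobian that maps joint velocities to contour-point velocities. This is not the authors' implementation; the function names, kernel size, sigmas, and damping term are assumptions chosen for illustration.

```python
import numpy as np
from scipy.ndimage import convolve

def anisotropic_gaussian_derivative(theta, sigma_u=4.0, sigma_v=1.5, size=15):
    """Derivative-of-Gaussian kernel elongated along direction theta.

    Smooths along the predicted extremal contour (sigma_u) and
    differentiates across it (sigma_v), so only image edges roughly
    parallel to the predicted contour respond strongly.
    Parameter values here are illustrative, not from the paper.
    """
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate coordinates: u runs along the contour, v across it.
    u = np.cos(theta) * x + np.sin(theta) * y
    v = -np.sin(theta) * x + np.cos(theta) * y
    g = np.exp(-0.5 * (u**2 / sigma_u**2 + v**2 / sigma_v**2))
    kernel = -(v / sigma_v**2) * g        # derivative across the contour
    return kernel / np.abs(kernel).sum()

def gauss_newton_step(q, residuals, jacobian, damping=1e-3):
    """One damped Gauss-Newton update on the joint-angle vector q.

    residuals : (m,) discrepancies between detected edges and
                predicted extremal-contour points.
    jacobian  : (m, n) matrix mapping joint velocities to contour-point
                velocities (computed analytically in the paper).
    """
    JtJ = jacobian.T @ jacobian + damping * np.eye(len(q))
    return q - np.linalg.solve(JtJ, jacobian.T @ residuals)

# Example use of the oriented filter on a synthetic image patch
# (theta would come from the predicted contour tangent at that point).
image = np.random.rand(64, 64)
edge_response = convolve(image, anisotropic_gaussian_derivative(theta=np.pi / 4))
```

In this reading of the abstract, the kinematic model supplies theta at each predicted contour point, the oriented filter gives edge measurements, and the Gauss-Newton step drives the 40 joint angles toward agreement between predicted and observed contours.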