Tracking Articulated Body by Dynamic Markov Network

  • Authors:
  • Ying Wu; Gang Hua; Ting Yu


  • Venue:
  • ICCV '03 Proceedings of the Ninth IEEE International Conference on Computer Vision - Volume 2
  • Year:
  • 2003

Abstract

A new method for visual tracking of articulated objects is presented. Analyzing articulated motion is challenging because the increase in dimensionality potentially demands a tremendous increase in computation. To ease this problem, we propose an approach that analyzes subparts locally while reinforcing the structural constraints at the same time. The computational model of the proposed approach is based on a dynamic Markov network, a generative model which characterizes the dynamics and the image observations of each individual subpart, as well as the motion constraints among different subparts. Probabilistic variational analysis of the model reveals a mean field approximation to the posterior densities of each subpart given the visual evidence, and provides a computationally efficient way to approach this difficult Bayesian inference problem. In addition, we design mean field Monte Carlo (MFMC) algorithms, in which a set of low-dimensional particle filters interact with each other and solve the high-dimensional problem collaboratively. Extensive experiments on tracking human body parts demonstrate the effectiveness, significance, and computational efficiency of the proposed method.
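The interaction described in the abstract, with low-dimensional particle filters exchanging mean-field messages, can be illustrated with a toy sketch. This is not the paper's implementation: the Gaussian observation model, the random-walk dynamics, the soft "stay one unit apart" articulation constraint `psi`, and all parameter values are hypothetical stand-ins chosen only to show the structure of a mean field Monte Carlo step for two 1-D subparts.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 500  # particles per subpart

def likelihood(x, obs, sigma=0.5):
    # Hypothetical stand-in for a subpart's local image-observation model.
    return np.exp(-0.5 * ((x - obs) / sigma) ** 2)

def psi(x_a, x_b, gap=1.0, sigma=0.3):
    # Hypothetical soft articulation constraint: subpart b prefers to
    # sit `gap` above subpart a.
    return np.exp(-0.5 * ((x_b - x_a - gap) / sigma) ** 2)

def mfmc_step(obs_a, obs_b, xa, xb, n_mf=5, noise=0.2):
    """One time step: propagate both particle sets through the dynamics,
    then run a few mean field iterations in which each filter reweights
    its particles by its own likelihood times the expected compatibility
    under the other filter's current belief."""
    xa = xa + rng.normal(0.0, noise, xa.size)  # random-walk dynamics
    xb = xb + rng.normal(0.0, noise, xb.size)
    wa = np.full(N, 1.0 / N)
    wb = np.full(N, 1.0 / N)
    for _ in range(n_mf):
        # message to a: E_{x_b ~ belief_b}[ psi(x_a, x_b) ]
        msg_a = psi(xa[:, None], xb[None, :]) @ wb
        wa = likelihood(xa, obs_a) * msg_a
        wa /= wa.sum()
        # symmetric message to b, under a's freshly updated belief
        msg_b = psi(xa[None, :], xb[:, None]) @ wa
        wb = likelihood(xb, obs_b) * msg_b
        wb /= wb.sum()
    # resample each low-dimensional filter independently
    return xa[rng.choice(N, N, p=wa)], xb[rng.choice(N, N, p=wb)]

# Toy run: two linked 1-D "subparts" drifting upward, observed noisily.
xa = rng.normal(0.0, 1.0, N)
xb = rng.normal(1.0, 1.0, N)
for t in range(20):
    truth_a = 0.1 * t
    xa, xb = mfmc_step(truth_a + rng.normal(0.0, 0.3),
                       truth_a + 1.0 + rng.normal(0.0, 0.3), xa, xb)
```

Each filter here is one-dimensional, so the cost per mean field iteration is an N-by-N kernel evaluation per pair of linked subparts rather than a particle filter over the joint state, which is the computational point the abstract makes.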