2D Articulated Tracking with Dynamic Bayesian Networks

  • Authors:
  • Chunhua Shen; Anton van den Hengel; Anthony Dick; Michael J. Brooks

  • Affiliations:
  • University of Adelaide; University of Adelaide; University of Adelaide; University of Adelaide

  • Venue:
  • CIT '04 Proceedings of the Fourth International Conference on Computer and Information Technology
  • Year:
  • 2004


Abstract

We present a novel method for tracking the motion of an articulated structure in a video sequence. The analysis of articulated motion is challenging because of the potentially large number of degrees of freedom (DOFs) of an articulated body. For particle-filter-based algorithms, the number of samples required for such high-dimensional problems can be computationally prohibitive. To alleviate this problem, we represent the articulated object as an undirected graphical model (or Markov Random Field, MRF) in which soft constraints between adjacent subparts are captured by conditional probability distributions. The graphical model is extended across time frames to implement a tracker. The tracking algorithm can be interpreted as a belief inference procedure on a dynamic Bayesian network. The discretisation of the state vectors makes it possible to utilise the efficient belief propagation (BP) and mean field (MF) algorithms to reason in this network. Experiments on real video sequences demonstrate that the proposed method is computationally efficient and performs well in tracking the human body.
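
To make the inference step concrete, the sketch below shows sum-product belief propagation over discretised states on a simple chain of parts (e.g. torso, upper arm, lower arm). It is illustrative only, not the authors' implementation: the chain topology, the unary potentials (image likelihoods) and the pairwise potentials (soft articulation constraints) are hypothetical placeholders standing in for the paper's learned models.

    import numpy as np

    def belief_propagation_chain(unary, pairwise):
        """Exact sum-product BP on a chain-structured MRF.

        unary:    list of T arrays; unary[i] has shape (K,), the image
                  likelihood of part i over its K discretised states.
        pairwise: list of T-1 arrays; pairwise[i] has shape (K, K), the
                  soft articulation constraint between parts i and i+1.
        Returns one normalised (K,) marginal belief per part.
        """
        T = len(unary)
        # Forward messages: from part i-1 into part i.
        m_fwd = [np.ones_like(unary[0]) for _ in range(T)]
        for i in range(1, T):
            msg = pairwise[i - 1].T @ (unary[i - 1] * m_fwd[i - 1])
            m_fwd[i] = msg / msg.sum()  # normalise for numerical stability
        # Backward messages: from part i+1 into part i.
        m_bwd = [np.ones_like(unary[0]) for _ in range(T)]
        for i in range(T - 2, -1, -1):
            msg = pairwise[i] @ (unary[i + 1] * m_bwd[i + 1])
            m_bwd[i] = msg / msg.sum()
        # Belief at each part = local evidence times both incoming messages.
        beliefs = []
        for i in range(T):
            b = unary[i] * m_fwd[i] * m_bwd[i]
            beliefs.append(b / b.sum())
        return beliefs

    # Toy example: three parts, 8 discretised states each.
    rng = np.random.default_rng(0)
    K, T = 8, 3
    unary = [rng.random(K) for _ in range(T)]
    # Placeholder pairwise potential favouring similar adjacent states.
    idx = np.arange(K)
    pairwise = [np.exp(-0.5 * (idx[:, None] - idx[None, :]) ** 2)
                for _ in range(T - 1)]
    beliefs = belief_propagation_chain(unary, pairwise)
    print(beliefs[0])  # marginal belief over part 0's states

Under these assumptions each BP pass costs O(T * K^2), independent of the exponential joint state space, which is the source of the efficiency gain over sampling the full articulated pose with a particle filter.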