A Dynamic Bayesian Network Approach to Multi-cue based Visual Tracking

  • Authors:
  • Tao Wang, Qian Diao, Yimin Zhang, Gang Song, Chunrong Lai, Gary Bradski

  • Affiliations:
  • Intel China Research Center, Beijing, P.R. China (all authors)

  • Venue:
  • ICPR '04: Proceedings of the 17th International Conference on Pattern Recognition (ICPR'04), Volume 2
  • Year:
  • 2004

Abstract

Visual tracking has long been an active research field in computer vision. However, robust tracking remains far from satisfactory under real-world conditions of background clutter, pose variation, and occlusion. To increase reliability, this paper presents a novel Dynamic Bayesian Network (DBN) approach to multi-cue based visual tracking. The method first extracts multi-cue observations, such as skin color, ellipse shape, and face detection, and then integrates them with hidden motion states in a compact DBN model. By using particle-based inference over multiple cues, our method works well even in background clutter, without resorting to simplifying linear-Gaussian assumptions. The experimental results are compared against the widely used CONDENSATION and Kalman filter (KF) approaches. Our better tracking results, along with the ease of fusing new cues into the DBN framework, suggest that this technique is a fruitful basis for building top-performing visual tracking systems.
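The core idea of particle-based inference with fused cues can be sketched in a few lines. This is a generic illustration, not the paper's exact DBN: the motion model, the Gaussian stand-in cues, and all parameter values below are assumptions chosen for the toy example. The key point it demonstrates is that per-cue likelihoods are multiplied into the particle weights, reflecting the DBN's assumption that observations are conditionally independent given the hidden state.

```python
import numpy as np

rng = np.random.default_rng(0)

def multi_cue_particle_step(particles, weights, cue_likelihoods, noise_std=2.0):
    """One predict-update-resample cycle of a particle filter fusing several cues.

    particles: (N, d) array of hidden motion states (e.g. target x, y position).
    cue_likelihoods: list of functions, each mapping particles -> (N,) likelihoods
                     (stand-ins for cues such as skin color, shape, face detection).
    """
    n = len(particles)
    # Predict: propagate each particle with a simple random-walk motion model.
    particles = particles + rng.normal(0.0, noise_std, particles.shape)
    # Update: fuse cues by multiplying their likelihoods into the weights
    # (conditional independence of observations given the state).
    for lik in cue_likelihoods:
        weights = weights * lik(particles)
    weights = weights / weights.sum()
    # Resample to avoid weight degeneracy.
    idx = rng.choice(n, size=n, p=weights)
    return particles[idx], np.full(n, 1.0 / n)

# Toy usage: two Gaussian "cues", both responding to a target near (10, 5).
def gaussian_cue(center, sigma):
    return lambda p: np.exp(-np.sum((p - center) ** 2, axis=1) / (2 * sigma**2))

particles = rng.normal(0.0, 20.0, (500, 2))
weights = np.full(500, 1.0 / 500)
for _ in range(10):
    particles, weights = multi_cue_particle_step(
        particles, weights,
        [gaussian_cue(np.array([10.0, 5.0]), 6.0),
         gaussian_cue(np.array([10.0, 5.0]), 8.0)])
estimate = particles.mean(axis=0)  # fused state estimate, near (10, 5)
```

Because the cues enter only as likelihood factors, adding a new cue amounts to appending one more function to the list, which mirrors the ease of fusing new cues that the abstract highlights.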