Robust Object Tracking Via Online Dynamic Spatial Bias Appearance Models

  • Authors:
  • Datong Chen; Jie Yang

  • Venue:
  • IEEE Transactions on Pattern Analysis and Machine Intelligence
  • Year:
  • 2007

Abstract

This paper presents a robust object tracking method based on a spatial bias appearance model learned dynamically from video. Motivated by the way the human visual system shifts attention among local regions of an object during tracking, we propose to partition an object into regions with different confidences and to track the object using a dynamic spatial bias appearance model (DSBAM) estimated from these region confidences. The confidence of a region is estimated to reflect both the discriminative power of the region in a feature space and its probability of occlusion. We propose a novel hierarchical Monte Carlo (HAMC) algorithm to learn region confidences dynamically in every frame. The algorithm consists of two levels of Monte Carlo processes, implemented as two particle filtering procedures, one at each level, and can efficiently extract high-confidence regions across video frames by exploiting the temporal consistency of region confidences. A dynamic spatial bias map is then generated from the high-confidence regions and is employed to adapt the appearance model of the object and to guide a tracking algorithm in searching for correspondences in adjacent video frames. We demonstrate the feasibility of the proposed method in video surveillance applications. The proposed method can be combined with many existing tracking systems to enhance their robustness.
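
The abstract describes the approach only at a high level. As a rough illustration of the general idea of spatially biased particle-filter tracking (not the authors' actual DSBAM/HAMC implementation), the sketch below weights per-region appearance likelihoods by learned region confidences inside a single-level particle filter. The grid-like region partition, the SSD-based likelihood, the exponential confidence update, and all names (`region_likelihood`, `track_frame`, `update_confidences`) are hypothetical simplifications; in particular, the paper's two-level hierarchical Monte Carlo confidence learning is collapsed here into a simple per-frame update.

```python
import numpy as np

rng = np.random.default_rng(0)

def region_likelihood(frame, x, y, template, region_size):
    """Appearance likelihood of one region: exp(-mean SSD) between the patch
    at (x, y) and the region's template (a hypothetical, simplified choice)."""
    h, w = region_size
    patch = frame[y:y + h, x:x + w].astype(float)
    if patch.shape != template.shape:      # region fell outside the frame
        return 1e-6
    ssd = np.mean((patch - template) ** 2)
    return np.exp(-ssd / 1000.0)

def track_frame(frame, particles, weights, templates, offsets,
                confidences, region_size, motion_std=4.0):
    """One particle-filter step with a spatial bias: each region's likelihood
    is scaled by its confidence before being fused into the particle weight."""
    # Propagate particles (object center hypotheses) with a random walk.
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    new_weights = np.empty(len(particles))
    for i, (px, py) in enumerate(particles.astype(int)):
        score = 0.0
        for (dx, dy), tmpl, conf in zip(offsets, templates, confidences):
            score += conf * region_likelihood(frame, px + dx, py + dy,
                                              tmpl, region_size)
        new_weights[i] = score
    new_weights *= weights
    new_weights /= new_weights.sum() + 1e-12
    # Resample to avoid weight degeneracy.
    idx = rng.choice(len(particles), size=len(particles), p=new_weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

def update_confidences(frame, estimate, templates, offsets,
                       confidences, region_size, lr=0.2):
    """Crude stand-in for confidence learning: regions that still match their
    templates gain confidence; occluded or ambiguous regions decay."""
    ex, ey = estimate.astype(int)
    match = np.array([region_likelihood(frame, ex + dx, ey + dy, t, region_size)
                      for (dx, dy), t in zip(offsets, templates)])
    confidences = (1 - lr) * confidences + lr * (match / (match.max() + 1e-12))
    return confidences / confidences.sum()
```

In this toy setup, `offsets` fixes a grid of regions relative to the object center, and the confidence vector plays the role of the spatial bias map: regions that remain discriminative and unoccluded dominate the particle weights, while unreliable regions are progressively ignored, which is the intuition the abstract attributes to the DSBAM.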