Interacting Multiple Model Particle Filter to Adaptive Visual Tracking

  • Authors:
  • Jianyu Wang (Harbin Institute of Technology); Debin Zhao (Harbin Institute of Technology and Chinese Academy of Sciences); Wen Gao (Harbin Institute of Technology and Chinese Academy of Sciences); Shiguang Shan (Harbin Institute of Technology and Chinese Academy of Sciences)

  • Venue:
  • ICIG '04: Proceedings of the Third International Conference on Image and Graphics
  • Year:
  • 2004

Abstract

Visual tracking can be formulated as a state estimation problem: estimating a target representation from observations in an image sequence. When the problem is approached in the Bayesian filtering framework, a critical factor is how to sample from the state evolution model so as to generate hypotheses with a high confidence level. In this paper, we introduce an Interacting Multiple Model Estimation (IMME) framework for adaptive visual tracking. The essence of the IMME framework is that the state is estimated by running several different models in parallel and by interacting among those models' estimates probabilistically. Based on the IMME framework, we propose a new variant of the particle filter, named the Interacting Multiple Model Particle Filter (IMMPF), in which hypotheses can be sampled adaptively from several different state evolution models. Experiments show that, compared with the standard particle filter, the IMMPF generates better hypotheses and hence better tracking results, especially when the target switches randomly among several motion modes.
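To make the idea concrete, below is a minimal sketch of one IMMPF-style filtering step: each particle carries a motion-mode label, switches modes according to a Markov transition probability matrix, is propagated by its current model, and is weighted by the observation likelihood. Everything here is an illustrative assumption rather than the authors' implementation: the 1-D position/velocity state, the two motion models (constant velocity and random walk), the Gaussian likelihood, and all parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 1-D state [position, velocity], two motion models.
N = 500                                  # number of particles
TPM = np.array([[0.9, 0.1],              # mode transition probability matrix
                [0.1, 0.9]])             # (assumed values, not from the paper)

def propagate_cv(x, dt=1.0, q=0.1):
    """Constant-velocity dynamics with additive process noise."""
    x = x.copy()
    x[:, 0] += dt * x[:, 1] + q * rng.standard_normal(len(x))
    x[:, 1] += q * rng.standard_normal(len(x))
    return x

def propagate_rw(x, q=0.5):
    """Random-walk dynamics: position diffuses, velocity is damped."""
    x = x.copy()
    x[:, 0] += q * rng.standard_normal(len(x))
    x[:, 1] *= 0.5
    return x

MODELS = [propagate_cv, propagate_rw]

def immpf_step(particles, modes, z, obs_std=1.0):
    """One IMM particle filter step: per-particle mode switching,
    model-conditioned propagation, likelihood weighting, resampling."""
    # 1. Interaction: each particle switches its motion model
    #    according to the transition probability matrix.
    modes = np.array([rng.choice(len(MODELS), p=TPM[m]) for m in modes])
    # 2. Prediction: propagate each particle with its current model,
    #    so hypotheses are drawn from several evolution models at once.
    for m, f in enumerate(MODELS):
        idx = modes == m
        if idx.any():
            particles[idx] = f(particles[idx])
    # 3. Update: weight hypotheses by a Gaussian likelihood of the
    #    observed position z.
    w = np.exp(-0.5 * ((z - particles[:, 0]) / obs_std) ** 2)
    w /= w.sum()
    # 4. Resample particles and their mode labels jointly; modes that
    #    explain the data well gain particles adaptively.
    keep = rng.choice(len(particles), size=len(particles), p=w)
    return particles[keep], modes[keep]

# Usage: track a short synthetic 1-D trajectory.
particles = rng.standard_normal((N, 2))
modes = rng.integers(0, len(MODELS), size=N)
for z in [0.5, 1.1, 1.4, 1.2, 2.0]:      # synthetic observations
    particles, modes = immpf_step(particles, modes, z)
    print(f"estimate: {particles[:, 0].mean():.2f}")
```

The joint resampling of particles and mode labels is what lets the model probabilities adapt online: when the target switches motion modes, particles propagated by the better-matching model receive higher weights and dominate after resampling.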