Sequential particle generation for visual tracking

  • Authors:
  • Yuanwei Lao, Junda Zhu, Yuan F. Zheng

  • Affiliations:
  • Department of Electrical and Computer Engineering, Ohio State University, Columbus, OH (all authors); Yuan F. Zheng also with Shanghai Jiao Tong University, Shanghai, China

  • Venue:
  • IEEE Transactions on Circuits and Systems for Video Technology
  • Year:
  • 2009


Abstract

A novel probabilistic tracking system is presented, which includes a sequential particle sampler and a fragment-based measurement model. Rather than generating particles independently as in a generic particle filter, the correlation between particles is used to improve sampling efficiency, especially when the target moves in an unexpected and abrupt fashion. We propose to update the proposal distribution by dynamically incorporating the most recent measurements and generating particles sequentially, where the user's contextual confidence in the measurement model is also involved. In addition, the matching template is divided into non-overlapping fragments, and by learning the background information only a subset of the most discriminative target regions is dynamically selected to measure each particle, where the model update is easily embedded to handle fast appearance changes. The two parts are dynamically fused so that the system is able to capture abrupt motions and produce a better localization of the moving target in an efficient way. With the improved discriminative power, the new algorithm also succeeds in handling partial occlusions and cluttered backgrounds. Experiments on both synthetic and real-world data verify the effectiveness of the new algorithm and demonstrate its superiority over existing methods.
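The core idea of sequential particle generation can be illustrated with a minimal sketch. This is not the paper's implementation: the 1-D state, Gaussian measurement model, and the `trust` parameter (standing in for the user's contextual confidence) are all illustrative assumptions. The key step it shows is re-centering the proposal distribution on the most recent measurements as particles are drawn one at a time, rather than sampling all particles independently from the motion prior.

```python
import numpy as np

rng = np.random.default_rng(0)

def likelihood(particle, observation, sigma=1.0):
    # Gaussian measurement model -- an illustrative stand-in for the
    # paper's fragment-based template matching score.
    return np.exp(-0.5 * ((particle - observation) / sigma) ** 2)

def sequential_sample(prev_state, observation, n_particles=50,
                      motion_std=2.0, trust=0.5):
    """Draw particles one at a time, shifting the proposal mean toward
    the best-scoring particle seen so far. `trust` is a hypothetical
    knob mimicking the user's confidence in the measurement model:
    trust=0 reduces to independent sampling from the motion prior."""
    mean = prev_state
    particles, weights = [], []
    for _ in range(n_particles):
        p = rng.normal(mean, motion_std)   # propose one particle
        w = likelihood(p, observation)     # measure it immediately
        particles.append(p)
        weights.append(w)
        # Incorporate the most recent measurements into the proposal:
        best = particles[int(np.argmax(weights))]
        mean = (1 - trust) * prev_state + trust * best
    weights = np.asarray(weights)
    weights /= weights.sum()
    # Weighted posterior-mean estimate of the target state.
    return float(np.dot(weights, particles))

# Toy run: the target jumps abruptly from 0 to 8. Because each new
# particle's proposal drifts toward high-likelihood regions, the
# sampler can follow the jump even though the motion prior cannot.
estimate = sequential_sample(prev_state=0.0, observation=8.0)
```

With independent sampling (`trust=0`), nearly all particles would land within a few motion standard deviations of the previous state and miss the abrupt jump; the sequential update lets later particles exploit what earlier ones have already measured.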