Visual tracking in continuous appearance space via sparse coding

  • Authors:
  • Guofeng Wang; Fan Zhong; Yue Liu; Qunsheng Peng; Xueying Qin

  • Affiliations:
  • Guofeng Wang, Fan Zhong, Xueying Qin: School of Computer Science and Technology, Shandong University, Jinan, China; Yue Liu: Academy of Mathematics and Systems Science, CAS, Beijing, China; Qunsheng Peng: State Key Lab of CAD&CG, Zhejiang University, Hangzhou, China

  • Venue:
  • ACCV'12: Proceedings of the 11th Asian Conference on Computer Vision, Volume Part III
  • Year:
  • 2012

Abstract

The particle filter is the most widely used framework for object tracking. Despite its advantages in handling complex cases, its discretization of the object appearance space makes it difficult to search for the solution efficiently, and the number of particles is also greatly limited by computational cost, especially for time-consuming object representations such as sparse representation. In this paper, we propose a novel tracking method in which the appearance space is relaxed to be continuous, so that the solution can be searched efficiently via iterative sparse coding. Like the particle filter, our method can be combined with many generic tracking methods; in particular, we adopt the ℓ1 tracker and demonstrate that our method improves both its efficiency and its accuracy compared with the particle-filter-based version. Another advantage of our method is that it can handle dynamic changes of object appearance by adaptively updating the object template model with the learned dictionary, while at the same time avoiding drift by using the representation error for supervision. Our method thus performs more robustly than previous methods in dynamic scenes with gradual changes. Both qualitative and quantitative evaluations demonstrate the efficiency and robustness of the proposed method.
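The abstract only names the building blocks, so as a concrete illustration the sketch below shows the core operation an ℓ1 tracker relies on: coding a vectorized candidate patch against a dictionary of object templates under an ℓ1 penalty, then scoring the candidate by its representation error (the quantity the abstract says is also used to supervise template updates). This is a minimal sketch under stated assumptions, not the authors' implementation: the ISTA solver, the dictionary D, and the penalty weight lam are illustrative choices not taken from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    """Element-wise soft-thresholding: the proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sparse_code(y, D, lam=0.05, n_iter=200):
    """Solve min_c 0.5*||y - D c||_2^2 + lam*||c||_1 with ISTA.

    y   : (d,)   vectorized candidate image patch
    D   : (d, k) dictionary whose columns are (unit-norm) object templates
    lam : sparsity weight (illustrative value, not from the paper)
    """
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    c = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ c - y)           # gradient of the quadratic term
        c = soft_threshold(c - grad / L, lam / L)
    return c

def representation_error(y, D, c):
    """Reconstruction error of a candidate; low error suggests the target."""
    return float(np.linalg.norm(y - D @ c) ** 2)

# Toy usage: a candidate close to one template, plus noise
rng = np.random.default_rng(0)
D = rng.standard_normal((256, 10))         # ten 16x16 templates, vectorized
D /= np.linalg.norm(D, axis=0)             # unit-norm columns
y = 0.9 * D[:, 0] + 0.01 * rng.standard_normal(256)
c = sparse_code(y, D)
print(representation_error(y, D, c))       # small for target-like candidates
```

In the paper's setting the search runs over a continuous appearance space rather than a fixed particle set, so a coding step like this would be invoked inside an iterative search loop, and candidates whose representation error stays low would also drive the adaptive template update described in the abstract.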