Tracking nonstationary visual appearances by data-driven adaptation

  • Authors:
  • Ming Yang; Zhimin Fan; Jialue Fan; Ying Wu

  • Affiliations:
  • NEC Laboratories America, Inc., Cupertino, CA; Electrical Engineering and Computer Science Department, Northwestern University, Evanston, IL; Electrical Engineering and Computer Science Department, Northwestern University, Evanston, IL; Electrical Engineering and Computer Science Department, Northwestern University, Evanston, IL

  • Venue:
  • IEEE Transactions on Image Processing
  • Year:
  • 2009

Abstract

Without any prior knowledge of the target, its appearance is usually the only cue available for visual tracking. In general, however, appearances are nonstationary, which may invalidate predefined visual measurements and often leads to tracking failure in practice. A natural solution is therefore to adapt the observation model to the nonstationary appearance. This idea, however, is threatened by the risk of adaptation drift, which originates in the ill-posed nature of the adaptation problem unless good data-driven constraints are imposed. Different from most existing adaptation schemes, we enforce three novel constraints for the optimal adaptation: 1) negative data, 2) bottom-up pair-wise data constraints, and 3) adaptation dynamics. Instantiating the general adaptation problem as a subspace adaptation problem, this paper presents a closed-form solution as well as a practical iterative algorithm for subspace tracking. Extensive experiments demonstrate that the proposed approach largely alleviates adaptation drift and achieves better tracking results over a wide variety of nonstationary scenes.
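To make the subspace-adaptation idea concrete, below is a minimal sketch of one common ingredient of such trackers: incrementally updating a low-dimensional appearance subspace as new observations arrive, with a forgetting factor that down-weights old data. This is a generic incremental-SVD scheme written for illustration, not the paper's actual algorithm; in particular it omits the negative-data and pair-wise constraints, and the `forget` parameter is only a crude stand-in for the adaptation-dynamics constraint. All function and parameter names here are assumptions.

```python
import numpy as np

def init_subspace(data, k=3):
    """Initialize an appearance subspace from initial samples.

    data : (d, n) matrix whose columns are vectorized appearance patches.
    Returns an orthonormal basis U (d, k), singular values s (k,),
    and the sample mean (d,).
    """
    mean = data.mean(axis=1)
    U, s, _ = np.linalg.svd(data - mean[:, None], full_matrices=False)
    return U[:, :k], s[:k], mean

def update_subspace(U, s, mean, new_data, k=3, forget=0.95):
    """Fold new samples into the subspace, down-weighting old data.

    The old subspace, rescaled by the forgetting factor, is concatenated
    with the newly centered samples and re-factorized by a small SVD.
    """
    # Simple exponential update of the running mean (a sketch).
    new_mean = forget * mean + (1 - forget) * new_data.mean(axis=1)
    centered = new_data - new_mean[:, None]
    # Old information enters as U * (forget * s); new data is appended.
    M = np.hstack([U * (forget * s), centered])
    U2, s2, _ = np.linalg.svd(M, full_matrices=False)
    return U2[:, :k], s2[:k], new_mean
```

In a tracker, the reconstruction error of a candidate patch under `(U, mean)` would serve as the visual measurement, and `update_subspace` would be called after each confident tracking result.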