Robust tracking with weighted online structured learning

  • Authors:
  • Rui Yao (School of Computer Science, Northwestern Polytechnical University, China; School of Computer Science, The University of Adelaide, Australia)
  • Qinfeng Shi (School of Computer Science, The University of Adelaide, Australia)
  • Chunhua Shen (School of Computer Science, The University of Adelaide, Australia)
  • Yanning Zhang (School of Computer Science, Northwestern Polytechnical University, China)
  • Anton van den Hengel (School of Computer Science, The University of Adelaide, Australia)

  • Venue:
  • ECCV'12: Proceedings of the 12th European Conference on Computer Vision, Part III
  • Year:
  • 2012

Abstract

Robust visual tracking requires constant updating of the target appearance model, but without losing track of previous appearance information. One of the difficulties with the online learning approach to this problem has been a lack of flexibility in modelling the inevitable variations in target and scene appearance over time. Traditional online learning treats each example equally, which causes previous appearances to be forgotten too quickly and places too little emphasis on the most recent observations. Through analysis of the visual tracking problem, we instead develop a novel weighted form of online risk, which allows the relative importance of training examples to be represented more finely. However, the traditional online learning framework does not accommodate this weighted form. We therefore also propose a principled approach to weighted online learning based on weighted reservoir sampling, and provide a weighted regret bound as a theoretical guarantee of performance. The proposed online learning framework can handle examples with different importance weights for binary, multiclass, and even structured output labels, with both linear and non-linear kernels. Applying the method to tracking yields an algorithm that is both efficient and accurate, even in the presence of severe appearance changes. Experimental results show that the proposed tracker outperforms the current state of the art.
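
The abstract names weighted reservoir sampling as the mechanism for retaining a weighted pool of past appearance examples within a fixed budget. The paper's exact sampling variant and weighting policy are not given here, so the sketch below is only a minimal illustration under stated assumptions: an Efraimidis-Spirakis style weighted reservoir (each item scored by the key u**(1/w)) combined with a hypothetical exponential recency weighting, showing how higher-weight examples come to dominate a fixed-size buffer without older ones being discarded outright.

```python
import heapq
import random


class WeightedReservoir:
    """Fixed-size buffer in which each item survives with probability
    proportional to its weight (Efraimidis-Spirakis style keys u**(1/w)).

    NOTE: a minimal sketch for illustration only; the sampling variant and
    weighting policy used in the paper may differ.
    """

    def __init__(self, capacity, seed=None):
        self.capacity = capacity
        self._rng = random.Random(seed)
        self._heap = []   # min-heap of (key, insertion_id, item, weight)
        self._count = 0   # tie-breaker so stored items never get compared

    def add(self, item, weight):
        if weight <= 0:
            return
        # Larger weights tend to draw larger keys, so they are kept longer.
        key = self._rng.random() ** (1.0 / weight)
        entry = (key, self._count, item, weight)
        self._count += 1
        if len(self._heap) < self.capacity:
            heapq.heappush(self._heap, entry)
        elif key > self._heap[0][0]:
            heapq.heapreplace(self._heap, entry)  # evict the smallest key

    def samples(self):
        """Current buffer contents as (item, weight) pairs."""
        return [(item, w) for _, _, item, w in self._heap]


if __name__ == "__main__":
    # Hypothetical usage: weight training examples from a tracking stream
    # so recent frames are emphasised but old ones are not all discarded.
    reservoir = WeightedReservoir(capacity=50, seed=0)
    decay = 0.99
    num_frames = 200
    for t in range(num_frames):
        example = ("features_frame_%d" % t, "label_frame_%d" % t)
        recency_weight = decay ** (num_frames - 1 - t)
        reservoir.add(example, recency_weight)
    kept = reservoir.samples()
    print(len(kept), "examples kept; recent frames are over-represented")
```

The min-heap keeps each insertion at O(log k) for a buffer of size k, which matters when the example pool is refreshed on every frame of a tracking sequence.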