Blurred target tracking by Blur-driven Tracker

  • Authors:
  • Yi Wu; Haibin Ling; Jingyi Yu; Feng Li; Xue Mei; Erkang Cheng

  • Affiliations:
  • Computer & Information Science Department, Temple University, Philadelphia, PA 19122, USA (Yi Wu; Haibin Ling; Erkang Cheng); Department of Computer and Information Sciences, University of Delaware, Newark, DE 19716, USA (Jingyi Yu; Feng Li); - (Xue Mei)

  • Venue:
  • ICCV '11 Proceedings of the 2011 International Conference on Computer Vision
  • Year:
  • 2011

Abstract

Visual tracking plays an important role in many computer vision tasks. A common assumption in previous methods is that the video frames are blur free. In reality, motion blurs are pervasive in real videos. In this paper we present a novel BLUr-driven Tracker (BLUT) framework for tracking motion-blurred targets. BLUT actively uses the information from blurs without performing deblurring. Specifically, we integrate the tracking problem with the motion-from-blur problem under a unified sparse approximation framework. We further use the motion information inferred from blurs to guide the sampling process in particle-filter-based tracking. To evaluate our method, we have collected a large number of video sequences with significant motion blurs and compared BLUT with state-of-the-art trackers. Experimental results show that, while many previous methods are sensitive to motion blurs, BLUT robustly and reliably tracks severely blurred targets.
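The idea of using blur-inferred motion to guide particle sampling can be illustrated with a minimal sketch. This is not the authors' implementation; `propagate_particles`, `alpha`, and `sigma` are illustrative assumptions showing how a motion estimate recovered from blur might bias the proposal distribution of a particle filter.

```python
import numpy as np

rng = np.random.default_rng(0)

def propagate_particles(particles, blur_motion, sigma=4.0, alpha=0.7):
    """Blur-guided proposal step (hypothetical sketch).

    Shift each particle's (x, y) state by the motion vector inferred
    from blur, then add Gaussian diffusion. `alpha` controls how
    strongly the blur estimate steers the sampling.
    """
    drift = alpha * np.asarray(blur_motion, dtype=float)  # blur-inferred drift
    noise = rng.normal(0.0, sigma, particles.shape)       # random diffusion
    return particles + drift + noise

# 100 particles around position (50, 50); blur suggests motion of (12, -3)
particles = np.full((100, 2), 50.0)
moved = propagate_particles(particles, blur_motion=(12.0, -3.0))
# The particle cloud's mean drifts toward alpha * blur_motion,
# i.e. roughly (58.4, 47.9) here, up to sampling noise.
print(moved.mean(axis=0))
```

Without the blur-derived drift term, the proposal would be a purely random walk around the previous state, which is exactly what makes conventional trackers lose severely blurred, fast-moving targets.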