Motion blur is pervasive in real captured video, especially from hand-held and smartphone cameras, owing to their low frame rates and sensor quality. This paper presents a novel Kernel-based motion-Blurred target Tracking (KBT) approach that accurately locates objects in motion-blurred video sequences without explicitly performing deblurring. To model the underlying motion blur, we first augment the target model by synthesizing a set of blurred templates from the target with different blur directions and strengths. These templates are then represented by color histograms regularized by an isotropic kernel. To locate the optimal position for each template, we use the mean shift method for iterative optimization. Finally, the region with maximum similarity to its corresponding template is taken as the target. To demonstrate the effectiveness and efficiency of our method, we collect several video sequences with severe motion blur and compare KBT with traditional trackers. Experimental results show that KBT can robustly and reliably track targets under strong motion blur.
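The pipeline the abstract describes can be sketched in a few self-contained pieces: a linear motion point-spread function for synthesizing blurred templates, a kernel-regularized color histogram for representing each template, the Bhattacharyya coefficient as the similarity measure, and one mean-shift iteration for localization. The sketch below is a minimal numpy illustration under simplifying assumptions (joint 8-bin RGB histograms, an Epanechnikov spatial kernel, edge-padded convolution); all function names are our own, not from the paper.

```python
import numpy as np

def motion_psf(length, angle_deg, size=15):
    """Line point-spread function approximating linear motion blur
    of a given length (pixels) and direction (degrees)."""
    psf = np.zeros((size, size))
    c = size // 2
    a = np.deg2rad(angle_deg)
    for t in np.linspace(-length / 2.0, length / 2.0, 4 * size):
        y = int(round(c + t * np.sin(a)))
        x = int(round(c + t * np.cos(a)))
        if 0 <= y < size and 0 <= x < size:
            psf[y, x] = 1.0
    return psf / psf.sum()

def blur_template(template, psf):
    """Synthesize a blurred template: per-channel convolution with the
    PSF, using edge padding as a simple boundary treatment."""
    h, w = template.shape[:2]
    k = psf.shape[0] // 2
    padded = np.pad(template.astype(float), ((k, k), (k, k), (0, 0)), mode='edge')
    out = np.zeros(template.shape, dtype=float)
    for dy in range(psf.shape[0]):
        for dx in range(psf.shape[1]):
            if psf[dy, dx] > 0:
                out += psf[dy, dx] * padded[dy:dy + h, dx:dx + w]
    return out

def epanechnikov(h, w):
    """Isotropic Epanechnikov kernel over an h-by-w patch, normalized."""
    ys, xs = np.mgrid[0:h, 0:w]
    ry = (ys - (h - 1) / 2.0) / (h / 2.0)
    rx = (xs - (w - 1) / 2.0) / (w / 2.0)
    k = np.maximum(1.0 - (rx ** 2 + ry ** 2), 0.0)
    return k / k.sum()

def bin_index(patch, bins=8):
    """Joint RGB histogram bin index per pixel (values in [0, 255])."""
    q = np.clip(patch, 0, 255).astype(int) * bins // 256
    return (q[..., 0] * bins + q[..., 1]) * bins + q[..., 2]

def kernel_histogram(patch, bins=8):
    """Color histogram regularized by the isotropic spatial kernel."""
    h, w = patch.shape[:2]
    weights = epanechnikov(h, w)
    hist = np.bincount(bin_index(patch, bins).ravel(),
                       weights=weights.ravel(), minlength=bins ** 3)
    return hist / hist.sum()

def bhattacharyya(p, q):
    """Similarity between two normalized histograms, in [0, 1]."""
    return float(np.sum(np.sqrt(p * q)))

def mean_shift_step(frame, center, q_hist, size, bins=8):
    """One mean-shift iteration: re-weight candidate pixels by
    sqrt(q/p) and move the window to their weighted centroid."""
    h, w = size
    y0 = int(round(center[0])) - h // 2
    x0 = int(round(center[1])) - w // 2
    patch = frame[y0:y0 + h, x0:x0 + w]
    p_hist = kernel_histogram(patch, bins)
    idx = bin_index(patch, bins)
    wgt = np.sqrt(q_hist[idx] / np.maximum(p_hist[idx], 1e-12))
    s = wgt.sum()
    if s == 0:
        return np.asarray(center, dtype=float)
    ys, xs = np.mgrid[0:h, 0:w]
    return np.array([y0 + (wgt * ys).sum() / s, x0 + (wgt * xs).sum() / s])
```

In the full method, mean shift would be iterated to convergence once per blurred template (one per sampled blur direction and strength), and the converged region with the highest Bhattacharyya similarity to its own template would be reported as the target.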