We address the problem of robust appearance-based visual tracking. First, a set of simplified biologically inspired features (SBIF) is proposed for object representation, and the Bhattacharyya coefficient is used to measure the similarity between the target model and candidate targets. The proposed appearance model is then incorporated into a Bayesian state-inference tracking framework that uses the SIR (sampling importance resampling) particle filter to propagate the sample distribution over time. Extensive experiments demonstrate that our algorithm is robust to partial occlusion and to variations in illumination and pose, is resistant to nearby distractors, and achieves state-of-the-art tracking accuracy.
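
The following is a minimal, illustrative Python sketch (not the authors' implementation) of the two components named above: the Bhattacharyya coefficient as a similarity measure between normalized feature histograms, and one predict-weight-resample step of a SIR particle filter. The feature extractor, the random-walk motion model, the Gaussian likelihood built from the coefficient, and all parameter values (motion_std, sigma, bin counts) are assumptions introduced here for illustration only.

    import numpy as np

    def bhattacharyya_coefficient(p, q):
        """Similarity between two normalized histograms p and q (each summing to 1)."""
        return np.sum(np.sqrt(p * q))

    def sir_step(particles, weights, target_hist, extract_hist,
                 motion_std=5.0, sigma=0.2):
        """One SIR (sampling importance resampling) particle-filter update.

        particles    : (N, 2) array of candidate object centers (x, y)
        weights      : (N,) importance weights from the previous frame
        target_hist  : normalized feature histogram of the target model
        extract_hist : callable mapping a candidate center to a normalized histogram
                       (placeholder for the SBIF-based appearance description)
        """
        n = len(particles)

        # 1. Propagate candidates with a simple random-walk motion model (assumed).
        particles = particles + np.random.normal(0.0, motion_std, particles.shape)

        # 2. Weight each candidate by its similarity to the target model. A common
        #    choice is exp(-(1 - rho) / sigma^2), where rho is the Bhattacharyya
        #    coefficient; this specific likelihood form is an assumption.
        rho = np.array([bhattacharyya_coefficient(target_hist, extract_hist(p))
                        for p in particles])
        weights = np.exp(-(1.0 - rho) / (sigma ** 2))
        weights /= weights.sum()

        # 3. Multinomial resampling, so high-weight candidates are duplicated and
        #    low-weight candidates are discarded.
        idx = np.random.choice(n, size=n, p=weights)
        particles = particles[idx]
        weights = np.full(n, 1.0 / n)

        # State estimate: mean of the resampled candidate centers.
        estimate = particles.mean(axis=0)
        return particles, weights, estimate

    # Toy usage with a dummy histogram extractor (purely illustrative):
    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        target_hist = np.full(16, 1.0 / 16)                     # uniform 16-bin model
        extract_hist = lambda c: np.full(16, 1.0 / 16)          # dummy candidate histogram
        particles = rng.normal([100.0, 100.0], 10.0, (200, 2))  # 200 candidate centers
        weights = np.full(200, 1.0 / 200)
        particles, weights, est = sir_step(particles, weights, target_hist, extract_hist)
        print("estimated center:", est)

In this sketch the appearance model enters only through extract_hist, so the same filtering loop works for any histogram-style description of a candidate region; resampling at every frame keeps the weights from degenerating onto a few particles.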