Developing robust visual tracking algorithms for real-world applications remains a major challenge. In this paper, we focus on robust object tracking with imaging sensors operating in multiple spectra. We propose a four-layer probabilistic fusion framework for visual tracking with imaging sensors both within and beyond the visible spectrum. The framework consists of four layers in a bottom-up fusion process: a visual-cues layer that fuses visual modalities via an adaptive fusion strategy, a models layer that fuses prior motion information via the interacting multiple model (IMM) method, a trackers layer that fuses the results of multiple trackers via adaptive tracking-mode switching, and a sensors layer that fuses multiple sensors in a distributed manner. Only state distributions are required at the input and output of each layer, which keeps the many visual modules within the framework consistent. Furthermore, the framework is general and allows fusion layers to be augmented or pruned according to the visual environment at hand. We test the proposed framework in various complex scenarios where a tracker based on a single sensor may fail, and obtain satisfactory tracking results.
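The key interface property stated above, that each layer exchanges only state distributions, can be illustrated with a minimal sketch. The snippet below is hypothetical (none of the names come from the paper) and simplifies each distribution to a 1-D Gaussian over object position, combined by inverse-variance weighting, a standard product-of-Gaussians fusion rule. It shows only the cues layer and the sensors layer; the models (IMM) and trackers layers are elided for brevity.

```python
from dataclasses import dataclass


@dataclass
class StateDist:
    """A state distribution, simplified to a 1-D Gaussian (mean, variance)."""
    mean: float
    var: float


def fuse(dists):
    """Fuse independent Gaussian estimates by inverse-variance weighting.

    The fused estimate has lower variance than any single input, which is
    why stacking fusion layers tightens the track estimate.
    """
    inv_var = sum(1.0 / d.var for d in dists)
    mean = sum(d.mean / d.var for d in dists) / inv_var
    return StateDist(mean, 1.0 / inv_var)


def track_step(cue_dists_per_sensor):
    """One bottom-up pass: fuse cues within each sensor, then fuse sensors.

    cue_dists_per_sensor: list (one entry per sensor) of lists of per-cue
    StateDist estimates. Returns the sensor-level fused StateDist.
    """
    per_sensor = [fuse(cues) for cues in cue_dists_per_sensor]  # cues layer
    return fuse(per_sensor)                                     # sensors layer


if __name__ == "__main__":
    # Sensor 1 has two cues (e.g. color and edges); sensor 2 has one cue.
    estimate = track_step([
        [StateDist(1.0, 1.0), StateDist(3.0, 1.0)],
        [StateDist(2.0, 0.5)],
    ])
    print(estimate)  # variance shrinks as evidence accumulates
```

Because every layer maps state distributions to a state distribution, layers can be inserted or removed (as the framework's augmenting/pruning property requires) without changing the interface between them.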