Computer vision is a key research topic of modern computer science, with applications in manufacturing, surveillance, automotive systems, robotics, and sophisticated human-machine interfaces. These applications require small and efficient solutions, commonly provided as embedded systems, which imposes resource constraints while also demanding increasing adaptivity and robustness. This paper proposes an autonomic computing framework for robust object tracking that combines a probabilistic tracking algorithm with multi-filter fusion of redundant image filters. Through self-adaptation, the system can react to unpredictable changes in the environment. Because of resource constraints, the number of filters actively used for tracking is limited; by means of self-organization, the system structure is reorganized to activate the filters best suited to the current context. The proposed framework is designed for, but not limited to, embedded computer vision. Experimental evaluations demonstrate the benefit of the approach.
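The fusion scheme sketched in the abstract — several redundant image filters ("cues") score the tracked hypothesis, only a resource-limited subset is active, and reliability weights adapt toward cues that agree with the fused result — can be illustrated as a minimal sketch. All function names, parameters, and the agreement measure below are illustrative assumptions in the spirit of democratic cue integration, not the paper's actual implementation.

```python
def fuse_scores(scores, weights, active):
    """Weighted fusion of cue scores over the currently active filter subset."""
    total_w = sum(weights[i] for i in active)
    return sum(weights[i] * scores[i] for i in active) / total_w

def adapt_weights(scores, weights, fused, rate=0.1):
    """Self-adaptation: shift reliability toward cues agreeing with the fused result."""
    agreement = [1.0 - abs(s - fused) for s in scores]
    new_w = [(1 - rate) * w + rate * a for w, a in zip(weights, agreement)]
    norm = sum(new_w)
    return [w / norm for w in new_w]

def select_active(weights, k):
    """Resource constraint: activate only the k currently most reliable filters."""
    return sorted(range(len(weights)), key=lambda i: -weights[i])[:k]

# One adaptation step on toy per-cue confidences for a single hypothesis.
scores = [0.9, 0.8, 0.2, 0.85]      # cue 2 is an outlier (e.g. context change)
weights = [0.25, 0.25, 0.25, 0.25]  # initially uniform reliabilities
active = select_active(weights, k=3)
fused = fuse_scores(scores, weights, active)
weights = adapt_weights(scores, weights, fused)
```

Repeating this step each frame lets the weight of a cue that stops matching the scene decay until `select_active` swaps in a better-suited filter, which is the self-organizing reorganization the abstract describes.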