Collaborative and reconfigurable object tracking
The Journal of Supercomputing
Many applications perceive visual information through networks of embedded sensors. Processing the perceived information requires intensive image-processing computations, which usually demand hardware implementations to achieve real-time performance. Moreover, many such applications are hard to characterize a priori, since they take different paths depending on events occurring in the scene at run time. Reconfigurable hardware devices are therefore the only viable platform for implementing such applications, providing both real-time performance and dynamic adaptability.

In this paper, we present a collaborative and dynamically adaptive object-tracking system built in our lab. We exploit reconfigurable hardware devices embedded in a number of networked cameras to achieve this goal, and we justify the need for dynamic adaptation of the system through scenarios and applications. Experimental results on a set of scenes show that our system works effectively across different scenarios of events through reconfiguration. Comparisons with non-adaptive implementations verify that our approach improves the system's robustness to scene variations and outperforms traditional implementations.
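The adaptation described above — networked cameras that switch their on-board hardware configuration in response to scene events — can be sketched in software. The sketch below is illustrative only: the event names, configuration names, and the `CameraNode`/`broadcast` structure are assumptions for exposition, not the paper's actual design, and the reconfiguration itself (loading an FPGA bitstream) is modeled as a simple state change.

```python
# Hypothetical event-to-configuration table; the paper does not specify
# these events or hardware kernels, they only illustrate the idea that
# different scene events call for different tracking hardware.
EVENT_TO_CONFIG = {
    "single_target": "kernel_correlation_tracker",
    "multiple_targets": "blob_detector",
    "occlusion": "predictive_tracker",
}

class CameraNode:
    """One networked camera with a reconfigurable (e.g. FPGA) back end."""

    def __init__(self, node_id, default_config="blob_detector"):
        self.node_id = node_id
        self.active_config = default_config
        self.reconfig_count = 0  # how many (modeled) reconfigurations occurred

    def handle_event(self, event):
        """Switch configurations only when the scene event demands it."""
        target = EVENT_TO_CONFIG.get(event, self.active_config)
        if target != self.active_config:
            # In a real system this is where a new bitstream would be loaded.
            self.active_config = target
            self.reconfig_count += 1
        return self.active_config

def broadcast(nodes, event):
    """Collaborative step: all cameras observing the event adapt together."""
    return [node.handle_event(event) for node in nodes]
```

Because a node only reconfigures when the required configuration actually changes, repeated occurrences of the same event incur no extra reconfiguration cost — one plausible reason an adaptive system can remain efficient under varying scenes.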