Collaborative and reconfigurable object tracking

  • Authors:
  • Soheil Ghiasi; Hyun J. Moon; Ani Nahapetian; Majid Sarrafzadeh

  • Affiliations:
  • Computer Science Department, University of California, Los Angeles (all authors)

  • Venue:
  • The Journal of Supercomputing
  • Year:
  • 2004

Abstract

Many applications perceive visual information through networks of embedded sensors. Processing the perceived information requires intensive image-processing computations, which usually demand hardware implementations to achieve real-time performance. Furthermore, many such applications are hard to characterize a priori, since they take different paths according to events occurring in the scene at runtime. Reconfigurable hardware devices are therefore the only viable platform for implementing such applications, providing both real-time performance and dynamic adaptability. In this paper, we present a collaborative and dynamically adaptive object tracking system that has been built in our lab. We exploit reconfigurable hardware devices embedded in a number of networked cameras to achieve this goal. We justify the need for dynamic adaptation of the system through scenarios and applications. Experimental results on a set of scenes show that our system works effectively across different scenarios of events through reconfiguration. A comparison with non-adaptive implementations verifies that our approach improves the system's robustness to scene variations and outperforms traditional implementations.
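
The abstract describes tracking hardware that is reconfigured at runtime in response to scene events. The sketch below is a hypothetical illustration of that idea only, not the authors' implementation: a controller that chooses among a few candidate tracking configurations based on simple scene statistics. All names, parameters, and thresholds (SceneStats, select_configuration, the CONFIGS table) are assumptions made for illustration.

```python
# Hypothetical sketch of event-driven reconfiguration on a camera node.
# The names and thresholds here are illustrative assumptions, not taken
# from the paper; they only show the idea of switching tracking
# configurations when scene conditions change.

from dataclasses import dataclass


@dataclass
class SceneStats:
    num_targets: int   # objects currently detected in the scene
    avg_motion: float  # mean per-frame displacement, in pixels


# Candidate configurations (e.g., alternative FPGA bitstreams), keyed by name.
CONFIGS = {
    "single_target_precise": {"window": 64, "search_radius": 8},
    "multi_target_coarse":   {"window": 32, "search_radius": 16},
    "fast_motion":           {"window": 32, "search_radius": 32},
}


def select_configuration(stats: SceneStats) -> str:
    """Pick a configuration from simple runtime scene statistics."""
    if stats.avg_motion > 12.0:   # fast-moving objects need a wider search
        return "fast_motion"
    if stats.num_targets > 1:     # many objects: trade precision for throughput
        return "multi_target_coarse"
    return "single_target_precise"


def reconfigure_if_needed(current: str, stats: SceneStats) -> str:
    """Load a new configuration only when the selection changes."""
    desired = select_configuration(stats)
    if desired != current:
        print(f"reconfiguring: {current} -> {desired} ({CONFIGS[desired]})")
        # On real hardware this step would trigger the device reconfiguration.
    return desired


if __name__ == "__main__":
    cfg = "single_target_precise"
    for stats in (SceneStats(1, 3.0), SceneStats(3, 5.0), SceneStats(3, 20.0)):
        cfg = reconfigure_if_needed(cfg, stats)
```

In this sketch the reconfiguration cost is paid only when the selected configuration actually changes, which mirrors the general motivation for adapting the hardware to the scene rather than running a single fixed implementation.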