Robust tracking with and beyond visible spectrum: a four-layer data fusion framework

  • Authors:
  • Jianru Xue; Nanning Zheng

  • Affiliations:
  • Institute of Artificial Intelligence and Robotics, Xi'an Jiaotong University, Xi'an, Shaanxi, China (both authors)

  • Venue:
  • IWICPAS '06: Proceedings of the International Workshop on Intelligent Computing in Pattern Analysis/Synthesis (Advances in Machine Vision, Image Processing, and Pattern Analysis)
  • Year:
  • 2006

Abstract

Developing robust visual tracking algorithms for real-world applications remains a major challenge today. In this paper, we focus on robust object tracking with multiple spectrum imaging sensors. We propose a four-layer probabilistic fusion framework for visual tracking with and beyond visible-spectrum imaging sensors. The framework consists of four layers in a bottom-up fusion process: a visual cues layer that fuses visual modalities via an adaptive fusion strategy, a models layer that fuses prior motion information via the interacting multiple model (IMM) method, a trackers layer that fuses results from multiple trackers via adaptive tracking-mode switching, and a sensors layer that fuses multiple sensors in a distributed way. Each layer exchanges only state distributions at its input and output, which keeps the many visual modules within the framework consistent. Furthermore, the proposed framework is general and allows fusing layers to be augmented or pruned according to the visual environment at hand. We test the proposed framework in various complex scenarios where a tracker based on a single sensor may fail, and obtain satisfactory tracking results.
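The abstract gives no implementation details, but the layer contract it describes (state distributions in, state distributions out) can be illustrated with a minimal Gaussian sketch. The functions and cue/model names below (`fuse_gaussians`, `imm_mix`, the color/edge cues, the constant-velocity and turning models) are hypothetical, assumed for illustration only; the paper's own layers may use different representations and fusion rules.

```python
import numpy as np

def fuse_gaussians(estimates):
    """Information-weighted fusion of independent Gaussian state estimates.

    `estimates` is a list of (mean, cov) pairs; the fused Gaussian sums the
    information matrices (inverse covariances). This is one common way a
    cue or sensor layer could combine several estimates of the same state.
    """
    infos = [np.linalg.inv(cov) for _, cov in estimates]
    fused_cov = np.linalg.inv(sum(infos))
    fused_mean = fused_cov @ sum(info @ mean
                                 for (mean, _), info in zip(estimates, infos))
    return fused_mean, fused_cov

def imm_mix(mode_probs, estimates):
    """IMM-style moment matching: collapse per-motion-model Gaussians into a
    single Gaussian weighted by the mode probabilities."""
    mixed_mean = sum(p * m for p, (m, _) in zip(mode_probs, estimates))
    mixed_cov = sum(
        p * (cov + np.outer(m - mixed_mean, m - mixed_mean))
        for p, (m, cov) in zip(mode_probs, estimates)
    )
    return mixed_mean, mixed_cov

# Toy run: two hypothetical visual cues fused at the cues layer, then two
# hypothetical motion models mixed at the models layer. Only (mean, cov)
# state distributions cross the layer boundary.
color_cue = (np.array([0.0, 0.0]), np.eye(2))
edge_cue  = (np.array([2.0, 2.0]), np.eye(2))
cue_fused = fuse_gaussians([color_cue, edge_cue])   # mean [1, 1], cov 0.5*I

cv_model   = cue_fused                              # constant-velocity prediction
turn_model = (np.array([1.5, 1.0]), np.eye(2))      # turning-model prediction
model_fused = imm_mix([0.7, 0.3], [cv_model, turn_model])
```

The point of the sketch is the interface, not the arithmetic: because every layer consumes and emits the same kind of object, layers can be chained, added, or pruned without rewiring the rest of the pipeline, which is the generality the abstract claims.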