Fusion of Intensity, Texture, and Color in Video Tracking Based on Mutual Information

  • Authors:
  • Joseph L. Mundy; Chung-Fu Chang

  • Affiliations:
  • Brown University; Lockheed Martin Integrated Systems and Solutions, Goodyear, Arizona

  • Venue:
  • AIPR '04 Proceedings of the 33rd Applied Imagery Pattern Recognition Workshop
  • Year:
  • 2004

Abstract

Next-generation reconnaissance systems (NGRS) will offer dynamic tasking of a menu of sensor modalities such as video, multi/hyper-spectral, and polarization data. A key issue is how best to exploit these modes in time-critical scenarios such as target tracking and event detection. It is essential to represent diverse sensor content in a unified measurement space so that the contribution of each modality to the exploitation task can be evaluated. In this paper, mutual information is used to represent the content of individual sensor channels. A series of experiments on video tracking demonstrates the effectiveness of mutual information as a fusion framework. These experiments quantify the relative information content of intensity, color, and polarization image channels.
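
To make the abstract's central idea concrete, the sketch below estimates mutual information between two image channels from their joint intensity histogram, the standard histogram-based estimator. This is not the authors' implementation; the function name, bin count, and the template/candidate arrays are illustrative assumptions. In a tracking setting, a higher score indicates that a channel (or candidate window) shares more information with the target template.

```python
import numpy as np

def mutual_information(channel_a, channel_b, bins=64):
    """Estimate mutual information (in bits) between two image channels
    from their joint intensity histogram (a common MI estimator; the
    paper's exact formulation may differ)."""
    # Joint histogram of co-occurring pixel values, normalized to a joint pmf.
    joint, _, _ = np.histogram2d(channel_a.ravel(), channel_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal pmf of channel_a
    py = pxy.sum(axis=0, keepdims=True)   # marginal pmf of channel_b
    # I(X;Y) = sum over nonzero cells of p(x,y) * log2(p(x,y) / (p(x)p(y))).
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

# Hypothetical usage: score a candidate track window against a target template.
rng = np.random.default_rng(0)
template = rng.integers(0, 256, size=(32, 32))
candidate = template + rng.integers(-8, 9, size=(32, 32))  # noisy match
print(mutual_information(template, candidate))
```

Because mutual information is expressed in the same units (bits) regardless of the underlying modality, scores computed this way for intensity, color, texture, or polarization channels can be compared directly, which is what makes it attractive as a unified fusion measure.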