A multiscale approach to pixel-level image fusion

  • Authors:
  • A. Ben Hamza — Concordia Institute for Information Systems Engineering, Concordia University, Montréal, Quebec, Canada H3G 1T7. E-mail: hamza@ciise.concordia.ca
  • Yun He — Tality Corporation, Cary, NC 27511, USA. E-mail: yhe@tality.com
  • Hamid Krim — Department of Electrical and Computer Engineering, North Carolina State University, Raleigh, NC 27695-7914, USA. E-mail: ahk@ncsu.edu
  • Alan Willsky — Laboratory for Information and Decision Systems, Massachusetts Institute of Technology, Cambridge, MA 02139-4307, USA. E-mail: willsky@mit.edu

  • Venue:
  • Integrated Computer-Aided Engineering
  • Year:
  • 2005


Abstract

Pixel-level image fusion refers to the processing and synergistic combination of information gathered by various imaging sources to provide a better understanding of a scene. We formulate image fusion as an optimization problem and propose an information-theoretic approach in a multiscale framework to solve it. A biorthogonal wavelet transform of each source image is first computed, and a new Jensen-Rényi divergence-based fusion algorithm is developed to construct composite wavelet coefficients according to the information patterns inherent in the source images. Experimental results on the fusion of multi-sensor navigation images, multi-focus optical images, multi-modality medical images, and multi-spectral remote sensing images illustrate the proposed fusion scheme.
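The pipeline described in the abstract (transform each source image into a multiscale representation, combine coefficients band by band, invert the transform) can be sketched as follows. This is a minimal illustration, not the authors' method: it uses a one-level 2D Haar decomposition in place of the paper's biorthogonal wavelet, a simple max-magnitude rule for the detail bands, and a standalone `jensen_renyi` function showing the divergence the paper's coefficient-selection rule is built on. All function names and the choice of α = 0.7 are this sketch's assumptions.

```python
import numpy as np

def haar2d(img):
    """One-level 2D Haar decomposition: approximation LL plus detail bands."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0   # row averages
    d = (img[0::2, :] - img[1::2, :]) / 2.0   # row differences
    LL = (a[:, 0::2] + a[:, 1::2]) / 2.0
    LH = (a[:, 0::2] - a[:, 1::2]) / 2.0
    HL = (d[:, 0::2] + d[:, 1::2]) / 2.0
    HH = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return LL, (LH, HL, HH)

def ihaar2d(LL, bands):
    """Exact inverse of haar2d."""
    LH, HL, HH = bands
    a = np.empty((LL.shape[0], LL.shape[1] * 2))
    a[:, 0::2], a[:, 1::2] = LL + LH, LL - LH
    d = np.empty_like(a)
    d[:, 0::2], d[:, 1::2] = HL + HH, HL - HH
    img = np.empty((a.shape[0] * 2, a.shape[1]))
    img[0::2, :], img[1::2, :] = a + d, a - d
    return img

def renyi_entropy(p, alpha=0.7):
    """Renyi entropy of order alpha (0 < alpha < 1) of a distribution p."""
    p = p[p > 0]
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def jensen_renyi(ps, alpha=0.7):
    """Jensen-Renyi divergence of a set of distributions, uniform weights:
    entropy of the mixture minus the mean of the individual entropies.
    Nonnegative for alpha in (0, 1), zero iff all distributions coincide."""
    ps = np.asarray(ps, dtype=float)
    return renyi_entropy(ps.mean(axis=0), alpha) - np.mean(
        [renyi_entropy(p, alpha) for p in ps])

def fuse(img1, img2):
    """Toy multiscale fusion: average the approximations, keep the
    larger-magnitude detail coefficient (a common baseline rule; the paper
    instead drives coefficient selection with the Jensen-Renyi divergence)."""
    LL1, det1 = haar2d(img1)
    LL2, det2 = haar2d(img2)
    LL = (LL1 + LL2) / 2.0
    det = tuple(np.where(np.abs(b1) >= np.abs(b2), b1, b2)
                for b1, b2 in zip(det1, det2))
    return ihaar2d(LL, det)
```

As a sanity check, fusing an even-sized image with itself reproduces the image (the transform is exactly invertible and both fusion rules are identity-preserving on equal inputs), and the divergence of two distinct distributions is strictly positive.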