Multiple input to multiple output images fusion based on turbo iteration

  • Authors:
  • Chu He, Meng-ling Liu, Na Li, Hong Sun

  • Affiliations:
  • Signal Processing Laboratory, Department of Communication Engineering, School of Electronic Information, Wuhan University, Wuhan, China (all authors)

  • Venue:
  • EURASIP Journal on Advances in Signal Processing - Special issue on advances in multidimensional synthetic aperture radar signal processing
  • Year:
  • 2010

Abstract

This paper addresses the fusion of multi-polarization Synthetic Aperture Radar (SAR) and color optical images by treating both as multichannel images. Building on traditional wavelet-based and model-based fusion algorithms, the paper proposes a multichannel image fusion algorithm based on a multiple-input-to-multiple-output frame and turbo iteration. The multiple-input-to-multiple-output frame represents the original image information as multiple outputs from viewpoints that better separate information, while turbo iteration balances wavelet-based and model-based fusion. The approach proceeds as follows. First, the Intensity-Hue-Saturation (IHS) transformation is applied to the SAR and optical images. Then, different fusion processes are applied to the corresponding components: fusion based on the multiple-input-to-multiple-output frame and turbo iteration is applied to the Intensity component, whereas weighted fusion is applied to the Hue and Saturation components. Finally, the inverse IHS transformation is applied to obtain the fused result. Experimental results show that the proposed algorithm effectively preserves the useful complementary information of the optical and SAR images.
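
As a rough illustration of the pipeline described in the abstract (IHS transform, component-wise fusion, inverse IHS), the following minimal NumPy sketch shows only the skeleton under stated assumptions: the linear triangular-model IHS transform is one common variant and is an assumption here, the function names `to_ihs`, `from_ihs`, `fuse_ihs` and the weights are hypothetical, and the plain weighted average on the Intensity channel is merely a placeholder for the paper's multiple-input-to-multiple-output / turbo-iteration fusion, which is not reproduced.

```python
import numpy as np

# Linear (triangular-model) IHS transform, a common choice in IHS-based fusion.
# Assumption: the paper does not specify which IHS variant it uses.
_IHS = np.array([
    [1 / 3,            1 / 3,            1 / 3],
    [-np.sqrt(2) / 6,  -np.sqrt(2) / 6,  np.sqrt(2) / 3],
    [1 / np.sqrt(2),   -1 / np.sqrt(2),  0.0],
])
_IHS_INV = np.linalg.inv(_IHS)


def to_ihs(img):
    """(H, W, 3) RGB-like array -> (H, W, 3) array of (Intensity, v1, v2)."""
    return img @ _IHS.T


def from_ihs(ihs):
    """Inverse of to_ihs: (Intensity, v1, v2) -> RGB-like array."""
    return ihs @ _IHS_INV.T


def fuse_ihs(optical_rgb, sar_rgb, w_intensity=0.5, w_chroma=0.8):
    """Hypothetical fusion skeleton.

    The Intensity fusion below is a plain weighted average standing in for the
    paper's multiple-input-to-multiple-output / turbo-iteration step; the two
    chroma channels (which carry hue and saturation) are combined by weighted
    fusion, as the abstract describes.
    """
    opt = to_ihs(optical_rgb.astype(np.float64))
    sar = to_ihs(sar_rgb.astype(np.float64))

    fused = np.empty_like(opt)
    # Intensity component: placeholder for the MIMO + turbo-iteration fusion.
    fused[..., 0] = w_intensity * opt[..., 0] + (1 - w_intensity) * sar[..., 0]
    # Chroma components: weighted fusion, biased toward the optical image.
    fused[..., 1:] = w_chroma * opt[..., 1:] + (1 - w_chroma) * sar[..., 1:]

    return np.clip(from_ihs(fused), 0.0, 1.0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    optical = rng.random((64, 64, 3))          # color optical image in [0, 1]
    sar = rng.random((64, 64, 3))              # e.g. polarization channels mapped to 3 bands
    result = fuse_ihs(optical, sar)
    print(result.shape, result.min(), result.max())
```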