To fuse a registered high-spatial-resolution panchromatic image with a low-spatial-resolution multispectral image of the same scene, we propose a new color-transfer-based fusion algorithm using the non-separable wavelet frame transform (NWFT). Three bands are selected from the source multispectral image as the channels to be fused, and a grayscale image is obtained by averaging them. Histogram matching is performed on the source panchromatic image so that its histogram matches that of this grayscale image. The histogram-matched panchromatic image is then decomposed by the NWFT, and the lowest-frequency subband of the NWFT coefficients is replaced with the grayscale image to produce composite NWFT coefficients. A composite image is obtained by applying the inverse NWFT to these coefficients. The three selected bands are mapped into the RGB (red-green-blue) color space, and their color information (strictly speaking, the spectral information of the source multispectral image) is transferred into the composite image by a color transfer method to obtain the final fused image. Experimental results show that the proposed algorithm works well for remote sensing image fusion.
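The pipeline above (grayscale from MS bands, histogram matching, frame decomposition, low-frequency substitution, inverse transform, color transfer) can be sketched in Python. Two loud assumptions: a separable à trous (undecimated) wavelet frame stands in for the paper's non-separable NWFT, and a simple per-channel statistics transfer stands in for the paper's color transfer method; the function names (`atrous_decompose`, `fuse`, `color_transfer`) and the 5-tap B3-spline kernel are illustrative choices, not from the source.

```python
import numpy as np
from scipy import ndimage

def histogram_match(src, ref):
    """Match the histogram of src to that of ref (grayscale float arrays)."""
    s_vals, s_idx, s_counts = np.unique(src.ravel(), return_inverse=True,
                                        return_counts=True)
    r_vals, r_counts = np.unique(ref.ravel(), return_counts=True)
    s_cdf = np.cumsum(s_counts).astype(np.float64) / src.size
    r_cdf = np.cumsum(r_counts).astype(np.float64) / ref.size
    mapped = np.interp(s_cdf, r_cdf, r_vals)  # map src CDF onto ref values
    return mapped[s_idx].reshape(src.shape)

def atrous_decompose(img, levels=3):
    """À trous wavelet frame (separable stand-in for the NWFT):
    returns detail planes and the lowest-frequency residual."""
    kernel = np.array([1, 4, 6, 4, 1], dtype=np.float64) / 16.0  # B3 spline
    planes, current = [], img.astype(np.float64)
    for j in range(levels):
        k = np.zeros((len(kernel) - 1) * 2**j + 1)
        k[::2**j] = kernel  # dilate the kernel by inserting zeros ("à trous")
        smooth = ndimage.convolve1d(current, k, axis=0, mode='mirror')
        smooth = ndimage.convolve1d(smooth, k, axis=1, mode='mirror')
        planes.append(current - smooth)  # wavelet (detail) plane
        current = smooth
    return planes, current  # details, lowest-frequency subband

def fuse(pan, ms_bands, levels=3):
    """Replace the low-frequency subband of the matched PAN image with the
    MS-band average, then invert the frame transform (sum of planes)."""
    gray = np.mean(ms_bands, axis=0)       # grayscale from the 3 MS bands
    pan_m = histogram_match(pan, gray)     # histogram-matched PAN
    details, _residual = atrous_decompose(pan_m, levels)
    return gray + sum(details)             # inverse transform with new base

def color_transfer(composite, ms_rgb):
    """Per-channel mean/std transfer (sketch, not the paper's method):
    impose each MS band's statistics on the composite image."""
    out = np.empty_like(np.asarray(ms_rgb, dtype=np.float64))
    for c in range(3):
        band = ms_rgb[c]
        out[c] = ((composite - composite.mean())
                  / (composite.std() + 1e-12)) * band.std() + band.mean()
    return out
```

Because the à trous frame is a telescoping sum, the detail planes plus the residual reconstruct the input exactly, which is what makes the low-frequency substitution step well defined.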