Integrated multilevel image fusion and match score fusion of visible and infrared face images for robust face recognition

  • Authors:
  • Richa Singh; Mayank Vatsa; Afzel Noore

  • Affiliations:
  • Lane Department of Computer Science and Electrical Engineering, West Virginia University, USA (all authors)

  • Venue:
  • Pattern Recognition
  • Year:
  • 2008

Abstract

This paper presents an integrated image fusion and match score fusion algorithm for multispectral face images. The fusion of visible and long-wave infrared face images is performed using a 2ν-granular SVM (2ν-GSVM), which uses multiple SVMs to learn both the local and global properties of the multispectral face images at different granularity levels and resolutions. The 2ν-GSVM performs accurate classification, which is subsequently used to dynamically compute the weights of the visible and infrared images for generating a fused face image. The 2D log-polar Gabor transform and local binary pattern feature extraction algorithms are applied to the fused face image to extract global and local facial features, respectively. The corresponding match scores are fused using the Dezert-Smarandache theory of fusion, which is based on plausible and paradoxical reasoning. The efficacy of the proposed algorithm is validated using the Notre Dame and Equinox databases and is compared with existing statistical, learning, and evidence-theory-based fusion algorithms.
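
As a concrete illustration of the lower fusion level described above, the sketch below blends registered visible and long-wave infrared face images using block-wise weights and then extracts a simple local binary pattern (LBP) histogram from the fused image. This is a minimal sketch only: the caller-supplied block weights, image sizes, and plain 3x3 LBP are assumptions standing in for the dynamically computed 2ν-GSVM weights and the full feature extraction (2D log-polar Gabor plus LBP) used in the paper.

```python
# Illustrative sketch (not the authors' implementation): block-wise weighted
# fusion of registered visible and long-wave infrared face images, followed by
# a basic 8-neighbour LBP histogram on the fused result. The per-block weights
# are assumed to be supplied by the caller; in the paper they are computed
# dynamically by the 2nu-GSVM classifier.
import numpy as np


def fuse_images(visible, infrared, weights, block=8):
    """Blend two registered, same-sized grayscale images block by block.

    `weights` holds one visible-image weight in [0, 1] per block; the
    infrared contribution is its complement (1 - weight).
    """
    fused = np.zeros_like(visible, dtype=np.float64)
    h, w = visible.shape
    for bi, y in enumerate(range(0, h, block)):
        for bj, x in enumerate(range(0, w, block)):
            wv = float(weights[bi, bj])
            fused[y:y + block, x:x + block] = (
                wv * visible[y:y + block, x:x + block]
                + (1.0 - wv) * infrared[y:y + block, x:x + block]
            )
    return fused


def lbp_histogram(image):
    """Basic 3x3 LBP: threshold each pixel's 8 neighbours against the
    centre pixel, pack the bits into a code, and histogram the codes."""
    img = image.astype(np.float64)
    center = img[1:-1, 1:-1]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(center, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = img[1 + dy:img.shape[0] - 1 + dy,
                        1 + dx:img.shape[1] - 1 + dx]
        codes |= (neighbour >= center).astype(np.uint8) << bit
    hist = np.bincount(codes.ravel(), minlength=256).astype(np.float64)
    return hist / hist.sum()


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    vis = rng.integers(0, 256, (64, 64)).astype(np.float64)   # stand-in visible image
    ir = rng.integers(0, 256, (64, 64)).astype(np.float64)    # stand-in LWIR image
    w = rng.random((8, 8))                                    # one weight per 8x8 block
    fused = fuse_images(vis, ir, w)
    print(lbp_histogram(fused)[:8])  # first few normalized histogram bins
```

In the paper, the resulting local (LBP) and global (log-polar Gabor) match scores are then combined at the match score level; the Dezert-Smarandache combination rule used there is not reproduced in this simplified sketch.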