Video fusion performance evaluation based on structural similarity and human visual perception

  • Authors:
  • Qiang Zhang, Long Wang, Huijuan Li, Zhaokun Ma

  • Affiliations:
  • Center for Complex Systems, School of Mechano-electronic Engineering, Xidian University, Xi'an, Shaanxi 710071, China (Qiang Zhang, Huijuan Li, Zhaokun Ma); Center for Systems and Control, College of Engineering and Key Laboratory of Machine Perception (Ministry of Education), Peking University, Beijing 100871, China (Long Wang)

  • Venue:
  • Signal Processing
  • Year:
  • 2012


Abstract

To evaluate video fusion algorithms with respect to temporal stability and consistency as well as spatial information transfer, this paper proposes a novel objective video fusion quality metric based on the structural similarity (SSIM) index and the perception characteristics of the human visual system (HVS). First, for each frame, two sub-indices, a spatial fusion quality index and a temporal fusion quality index, are defined from weighted local SSIM indices. Second, an individual-frame fusion quality measure for the current frame is obtained by integrating these two sub-indices. Finally, the proposed global video fusion metric is constructed as the weighted average of the individual-frame fusion quality measures. In addition, in accordance with the perception characteristics of the HVS, local and global spatio-temporal information, such as local variance, pixel movement, global contrast, and background motion, is used to define the weights in the proposed metric. Several sets of experimental results demonstrate that the proposed metric evaluates different video fusion algorithms accurately, and its evaluation results agree well with subjective assessments.
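The three-step structure described in the abstract can be sketched in code. This is a minimal illustration, not the authors' implementation: it assumes two source sequences and one fused sequence, uses block-wise SSIM with a uniform window, stands in for the paper's HVS-derived weights with simple local-variance weights and a fixed spatial/temporal blend, and all function names are invented for this example.

```python
import numpy as np

def local_ssim(x, y, win=8, C1=6.5025, C2=58.5225):
    """Mean SSIM over non-overlapping win x win blocks (uniform-window sketch)."""
    h, w = x.shape
    vals = []
    for i in range(0, h - win + 1, win):
        for j in range(0, w - win + 1, win):
            a = x[i:i + win, j:j + win].astype(np.float64)
            b = y[i:i + win, j:j + win].astype(np.float64)
            ma, mb = a.mean(), b.mean()
            va, vb = a.var(), b.var()
            cov = ((a - ma) * (b - mb)).mean()
            vals.append(((2 * ma * mb + C1) * (2 * cov + C2)) /
                        ((ma ** 2 + mb ** 2 + C1) * (va + vb + C2)))
    return float(np.mean(vals))

def video_fusion_quality(src_a, src_b, fused):
    """Global metric: average of per-frame spatial/temporal quality measures.

    Spatial sub-index: variance-weighted SSIM of the fused frame against each
    source frame. Temporal sub-index: the same comparison applied to
    inter-frame difference images, capturing temporal stability/consistency.
    """
    eps = 1e-9  # avoids division by zero for flat frames
    scores = []
    for t in range(len(fused)):
        fa, fb, ff = src_a[t], src_b[t], fused[t]
        wa, wb = fa.var() + eps, fb.var() + eps  # variance stands in for HVS saliency
        q_s = (wa * local_ssim(ff, fa) + wb * local_ssim(ff, fb)) / (wa + wb)
        if t == 0:
            scores.append(q_s)  # no previous frame: spatial quality only
            continue
        da = fa - src_a[t - 1]
        db = fb - src_b[t - 1]
        df = ff - fused[t - 1]
        wda, wdb = da.var() + eps, db.var() + eps  # motion-energy weights
        q_t = (wda * local_ssim(df, da) + wdb * local_ssim(df, db)) / (wda + wdb)
        scores.append(0.5 * (q_s + q_t))  # fixed blend replaces the paper's HVS weights
    return float(np.mean(scores))
```

When the fused sequence is identical to both sources, every SSIM term is 1 and the metric returns 1; a well-behaved fusion of dissimilar sources scores between the two pairwise SSIM values.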