Letters: Noise robust face hallucination employing Gaussian-Laplacian mixture model

  • Authors:
  • Zhong-Yuan Wang, Zhen Han, Rui-Min Hu, Jun-Jun Jiang


  • Venue:
  • Neurocomputing
  • Year:
  • 2014


Abstract

Because of its excellent ability to characterize the sparsity of natural images, the ℓ1-norm sparse representation (SR) is widely used to formulate the linear combination relationship in dictionary-learning-based face hallucination. However, due to the inherently less sparse nature of noisy images, the Laplacian prior underlying the ℓ1-norm is overly aggressive in terms of sparsity, which ultimately leads to significant degradation of hallucination performance in the presence of noise. To this end, we suggest a moderately sparse prior model, referred to as a Gaussian-Laplacian mixture (GLM) distribution, and employ it to infer the optimal solution under the Bayesian framework. The resulting regularization method, known as the elastic net (EN), not only maintains the same hallucination performance as SR under noise-free scenarios but also outperforms it remarkably in the presence of noise. Experimental results on simulated and real-world noisy images show its superiority over several state-of-the-art methods.
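
For illustration, below is a minimal sketch (not the authors' code) of the elastic-net-regularized patch reconstruction step the abstract describes: combination weights over a low-resolution dictionary are estimated with a mixed ℓ1/ℓ2 penalty (the moderately sparse prior), then applied to the coupled high-resolution dictionary. The dictionaries, patch sizes, and penalty parameters here are illustrative assumptions.

```python
# Sketch of elastic-net patch hallucination; dictionaries and parameters are hypothetical.
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(0)

# Hypothetical coupled dictionaries: columns are LR/HR training patches.
n_atoms, lr_dim, hr_dim = 200, 25, 100        # e.g. 5x5 LR and 10x10 HR patches
D_lr = rng.standard_normal((lr_dim, n_atoms))
D_hr = rng.standard_normal((hr_dim, n_atoms))

y_lr = rng.standard_normal(lr_dim)            # an observed (possibly noisy) LR patch

# Elastic net: alpha scales the overall penalty; l1_ratio balances the
# l1 (Laplacian) and l2 (Gaussian) terms, giving a moderately sparse prior.
en = ElasticNet(alpha=0.05, l1_ratio=0.5, fit_intercept=False, max_iter=5000)
en.fit(D_lr, y_lr)
w = en.coef_                                   # combination weights over atoms

# Hallucinated HR patch: reuse the same weights with the HR dictionary.
x_hr = D_hr @ w
print(x_hr.shape, np.count_nonzero(w))
```

Setting l1_ratio to 1 recovers pure ℓ1 sparse representation, while values below 1 add the ℓ2 (Gaussian) component that the abstract argues is better suited to noisy inputs.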