The Equivalence of Half-Quadratic Minimization and the Gradient Linearization Iteration

  • Authors:
  • M. Nikolova; R. H. Chan

  • Affiliations:
  • Centre de Mathématiques et de Leurs Applications, Cachan; -

  • Venue:
  • IEEE Transactions on Image Processing
  • Year:
  • 2007

Abstract

A popular way to restore images containing edges is to minimize a cost function combining a quadratic data-fidelity term and an edge-preserving (possibly nonconvex) regularization term. Mainly because of the latter term, computing the solution is slow and cumbersome. Half-quadratic (HQ) minimization (multiplicative form) was pioneered by Geman and Reynolds (1992) to alleviate the computational burden of image reconstruction with nonconvex regularization. By promoting the idea of locally homogeneous image models with a continuous-valued line process, they reformulated the optimization problem in terms of an augmented cost function which is quadratic with respect to the image and separable with respect to the line process, hence the name "half quadratic." Since then, a large number of papers have been dedicated to HQ minimization, and important results, including edge preservation under convex regularization and convergence, have been obtained. In this paper, we show that HQ minimization (multiplicative form) is equivalent to the simplest and most basic method, where the gradient of the cost function is linearized at each iteration step. In fact, both methods give exactly the same iterations. Furthermore, connections of HQ minimization with other methods, such as the quasi-Newton method and the generalized Weiszfeld method, are straightforward.
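
To make the equivalence concrete, here is a brief sketch in standard notation; the symbols A (observation operator), y (data), G (difference operators with rows G_i), the potential φ, the parameter β, and the dual function ψ are assumed notation, not taken from the abstract:

```latex
% Cost: quadratic data fidelity plus edge-preserving regularization
J(x) = \|Ax - y\|^2 + \beta \sum_i \varphi(|G_i x|)

% Geman-Reynolds (multiplicative) augmentation: choose \psi so that
% \varphi(t) = \min_b \big( b\, t^2 + \psi(b) \big); the augmented cost
\tilde{J}(x, b) = \|Ax - y\|^2 + \beta \sum_i \big( b_i (G_i x)^2 + \psi(b_i) \big)
% is quadratic in x and separable in the line process b.

% Alternating minimization: a closed-form b-step and a linear x-step
b_i^{(k)} = \frac{\varphi'(|G_i x^{(k)}|)}{2\,|G_i x^{(k)}|}, \qquad
\big( A^{\top} A + \beta\, G^{\top} \operatorname{diag}(b^{(k)})\, G \big)\, x^{(k+1)} = A^{\top} y
```

Freezing the weights b_i at the current iterate and setting the resulting linearized gradient of J to zero yields exactly the same linear system, which is the equivalence the paper establishes.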
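
A minimal numerical sketch of this identity, using the same assumed notation and the smooth edge-preserving potential φ(t) = √(t² + ε); the 1-D denoising setup below is illustrative only, not the paper's experiments:

```python
import numpy as np

# Assumed toy setup: J(x) = ||A x - y||^2 + beta * sum_i phi(|(G x)_i|),
# with phi(t) = sqrt(t^2 + eps), so phi'(t) / (2 t) = 1 / (2 sqrt(t^2 + eps)).
rng = np.random.default_rng(0)
n = 64
A = np.eye(n)                    # identity observation operator (denoising)
y = rng.normal(size=n)           # synthetic data
G = np.diff(np.eye(n), axis=0)   # first-order finite-difference operator
beta, eps = 0.5, 1e-3

def grad_J(x):
    """Exact gradient of the nonquadratic cost J."""
    t = G @ x
    return 2 * A.T @ (A @ x - y) + beta * G.T @ (t / np.sqrt(t**2 + eps))

def hq_matrix(x):
    """Normal matrix of the HQ (multiplicative form) x-step at iterate x."""
    b = 1.0 / (2.0 * np.sqrt((G @ x) ** 2 + eps))  # line process b_i = phi'(t)/(2t)
    return A.T @ A + beta * G.T @ (b[:, None] * G)

x = np.zeros(n)
for _ in range(30):
    H = hq_matrix(x)
    # The HQ x-step solves H x_new = A^T y.  The identical system arises by
    # linearizing grad J at the current x: grad J(x) = 2 (H x - A^T y).
    assert np.allclose(grad_J(x), 2 * (H @ x - A.T @ y))
    x = np.linalg.solve(H, A.T @ y)
```

The assertion checks, at every iterate, that the HQ linear system is precisely the gradient of J with the weights frozen at the current point, so the HQ update and the gradient-linearization update coincide step for step.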