Primal dual algorithms for convex models and applications to image restoration, registration and nonlocal inpainting

  • Authors:
  • Tony F. Chan; John Ernest Esser

  • Affiliations:
  • University of California, Los Angeles; University of California, Los Angeles

  • Venue:
  • Ph.D. Dissertation, University of California, Los Angeles
  • Year:
  • 2010


Abstract

The main subject of this dissertation is a class of practical algorithms for minimizing convex, non-differentiable functionals arising from image processing problems posed as variational models. This work builds largely on the work of Goldstein and Osher [GO09] and Zhu and Chan [ZC08], who proposed, respectively, the split Bregman and the primal-dual hybrid gradient (PDHG) methods for total variation (TV) image restoration. We relate these algorithms to classical methods and generalize their applicability. We also propose new convex variational models for image registration and patch-based nonlocal inpainting and solve them with a variant of the PDHG method. We draw connections between popular methods for convex optimization in image processing by placing them in a general framework of Lagrangian-based alternating direction methods. Furthermore, operator splitting and decomposition techniques are used to extend their application to a large class of problems, namely minimizing sums of convex functions composed with linear operators and subject to convex constraints. Numerous problems in image and signal processing, such as denoising, deblurring, basis pursuit, segmentation, inpainting and many more, can be modeled as minimizing exactly such functionals. Numerical examples focus especially on cases where such functionals can be minimized by solving a sequence of simple convex minimization problems with explicit formulas for their solutions. In the case of the split Bregman method, we point out an equivalence to the classical alternating direction method of multipliers (ADMM) and Douglas-Rachford splitting methods. Existing convergence arguments and some minor extensions justify application to common image processing problems. In the case of PDHG, its general convergence is still an open problem, but in joint work with Xiaoqun Zhang and Tony Chan we propose a simple modification that guarantees convergence. We also show convergence of some special cases of the original method. Numerical examples show PDHG and its variants to be especially well suited for large-scale problems because their simple, explicit iterations can be constructed to avoid inverting large matrices at each iteration. The two proposed convex variational models for image registration and nonlocal inpainting are novel because most existing variational approaches require minimizing nonconvex functionals.
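To illustrate the kind of explicit, inversion-free iteration the abstract describes, the following is a minimal NumPy sketch of a modified PDHG scheme (with an over-relaxation step of the kind used to guarantee convergence) applied to the TV-regularized denoising (ROF) model min_u TV(u) + λ/2 ||u − f||². The finite-difference discretization, parameter values, and function names are illustrative assumptions, not the dissertation's exact implementation:

```python
import numpy as np

def grad(u):
    """Forward-difference gradient with Neumann boundary conditions."""
    gx = np.zeros_like(u)
    gy = np.zeros_like(u)
    gx[:-1, :] = u[1:, :] - u[:-1, :]
    gy[:, :-1] = u[:, 1:] - u[:, :-1]
    return gx, gy

def div(px, py):
    """Discrete divergence, the negative adjoint of grad."""
    d = np.zeros_like(px)
    d[0, :] = px[0, :]
    d[1:-1, :] = px[1:-1, :] - px[:-2, :]
    d[-1, :] = -px[-2, :]
    d[:, 0] += py[:, 0]
    d[:, 1:-1] += py[:, 1:-1] - py[:, :-2]
    d[:, -1] += -py[:, -2]
    return d

def pdhg_rof(f, lam=8.0, tau=0.25, sigma=0.5, theta=1.0, iters=200):
    """Modified PDHG for min_u TV(u) + lam/2 * ||u - f||^2.

    Every step is an explicit formula (a dual projection and a
    pointwise primal update); no linear system is ever inverted.
    Step sizes satisfy tau * sigma * L^2 <= 1 with L^2 = 8 for
    this discretization of the gradient operator.
    """
    u = f.copy()
    ubar = u.copy()
    px = np.zeros_like(f)
    py = np.zeros_like(f)
    for _ in range(iters):
        # Dual ascent step, then projection onto pointwise unit balls.
        gx, gy = grad(ubar)
        px += sigma * gx
        py += sigma * gy
        norm = np.maximum(1.0, np.sqrt(px**2 + py**2))
        px /= norm
        py /= norm
        # Primal step: closed-form proximal update for the quadratic term.
        u_old = u
        u = (u + tau * (div(px, py) + lam * f)) / (1.0 + tau * lam)
        # Over-relaxation step of the convergent modified scheme.
        ubar = u + theta * (u - u_old)
    return u
```

Each iteration costs only a few array operations per pixel, which is why such schemes scale to large images where forming or factoring the underlying operators would be impractical.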