Convergence of an Iterative Method for Total Variation Denoising

  • Authors:
  • David C. Dobson; Curtis R. Vogel

  • Venue:
  • SIAM Journal on Numerical Analysis
  • Year:
  • 1997

Abstract

In total variation denoising, one attempts to remove noise from a signal or image by solving a nonlinear minimization problem involving a total variation criterion. Several approaches based on this idea have recently been shown to be very effective, particularly for denoising functions with discontinuities. This paper analyzes the convergence of an iterative method for solving such problems. The iterative method involves a "lagged diffusivity" approach in which a sequence of linear diffusion problems is solved. Global convergence in a finite-dimensional setting is established, and local convergence properties, including rates and their dependence on various parameters, are examined.
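To make the "lagged diffusivity" idea concrete, the sketch below implements a standard fixed-point iteration of this type for a 1D discrete signal: at each step the diffusivity weights are computed from ("lagged" at) the current iterate, reducing the nonlinear TV problem to a linear system. The objective, parameter names (`alpha`, `beta`), and test signal here are illustrative choices, not taken from the paper itself; `beta` is the usual small smoothing parameter that regularizes the non-differentiable TV term.

```python
import numpy as np

def tv_denoise_1d(f, alpha=0.5, beta=1e-6, iters=50):
    """Lagged diffusivity fixed-point iteration for 1D TV denoising.

    Approximately minimizes
        (1/2)||u - f||^2 + alpha * sum_i sqrt((Du)_i^2 + beta),
    where D is the forward-difference operator. Each iteration solves
    the linear ("diffusion") system (I + alpha * D^T W D) u_new = f,
    with the diagonal weights W lagged at the current iterate u.
    """
    n = len(f)
    # Forward-difference matrix D of shape (n-1, n): row i = e_{i+1} - e_i
    D = np.diff(np.eye(n), axis=0)
    u = f.copy()
    for _ in range(iters):
        # Lagged diffusivity weights: 1 / sqrt((Du)_i^2 + beta)
        w = 1.0 / np.sqrt((D @ u) ** 2 + beta)
        # Linear diffusion problem for the next iterate
        A = np.eye(n) + alpha * D.T @ (w[:, None] * D)
        u = np.linalg.solve(A, f)
    return u

# Noisy step function: TV denoising should flatten the noise
# while preserving the jump discontinuity.
rng = np.random.default_rng(0)
f = np.concatenate([np.zeros(50), np.ones(50)]) + 0.1 * rng.standard_normal(100)
u = tv_denoise_1d(f)
```

In practice the denoised `u` is nearly piecewise constant: the total variation of `u` is far smaller than that of the noisy input `f`, while the step edge remains sharp, which is the behavior for discontinuous functions that the abstract highlights.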