An efficient ADMM algorithm for multidimensional anisotropic total variation regularization problems

  • Authors:
  • Sen Yang, Jie Wang, Wei Fan, Xiatian Zhang, Peter Wonka, Jieping Ye

  • Affiliations:
  • Arizona State University, Tempe, Arizona, USA (Sen Yang, Jie Wang, Peter Wonka, Jieping Ye); Huawei Noah's Ark Lab, Hong Kong, China (Wei Fan, Xiatian Zhang)

  • Venue:
  • Proceedings of the 19th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD)
  • Year:
  • 2013

Abstract

Total variation (TV) regularization has important applications in signal processing, including image denoising, image deblurring, and image reconstruction. A significant challenge in the practical use of TV regularization lies in the nondifferentiable convex optimization problem, which is difficult to solve especially for large-scale instances. In this paper, we propose an efficient alternating direction method of multipliers (ADMM) algorithm to solve total variation regularization problems. The proposed algorithm is applicable to tensors, so it can solve multidimensional total variation regularization problems. One appealing feature of the proposed algorithm is that it does not need to solve a linear system of equations, which is often the most expensive part of previous ADMM-based methods. In addition, each step of the proposed algorithm involves a set of independent, smaller problems that can be solved in parallel, so the algorithm scales to large problems. Furthermore, the global convergence of the proposed algorithm is guaranteed, and its time complexity is O(dN/ε) on a d-mode tensor with N entries for achieving an ε-optimal solution. Extensive experimental results demonstrate the superior performance of the proposed algorithm in comparison with current state-of-the-art methods.
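
For intuition, below is a minimal sketch of a standard ADMM splitting for one-dimensional anisotropic TV denoising, i.e. min_x 0.5*||x - b||^2 + lam*||Dx||_1 with D the forward-difference operator, written in Python with NumPy/SciPy. It is not the authors' algorithm: the x-update here solves exactly the kind of linear system that the proposed method is designed to avoid, and the function names (tv1d_admm, soft_threshold) and parameter defaults are illustrative assumptions.

    import numpy as np
    from scipy.sparse import eye, diags
    from scipy.sparse.linalg import spsolve

    def soft_threshold(v, kappa):
        """Elementwise soft-thresholding: the proximal operator of kappa*||.||_1."""
        return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

    def tv1d_admm(b, lam, rho=1.0, n_iter=200):
        """Standard ADMM for 1D anisotropic TV denoising:
            min_x 0.5*||x - b||^2 + lam*||D x||_1,
        with the splitting z = D x. Note the x-update solves a sparse linear
        system; the paper's algorithm is designed to avoid this step."""
        n = len(b)
        # Forward-difference matrix D of shape (n-1, n): (Dx)[i] = x[i+1] - x[i]
        D = diags([-np.ones(n - 1), np.ones(n - 1)], offsets=[0, 1],
                  shape=(n - 1, n)).tocsc()
        A = (eye(n) + rho * (D.T @ D)).tocsc()   # system matrix for the x-update
        x = b.copy()
        z = D @ x
        u = np.zeros(n - 1)                      # scaled dual variable
        for _ in range(n_iter):
            x = spsolve(A, b + rho * D.T @ (z - u))    # x-update (linear system)
            z = soft_threshold(D @ x + u, lam / rho)   # z-update (prox of l1 term)
            u = u + D @ x - z                          # scaled dual update
        return x

    # Usage: denoise a noisy piecewise-constant signal.
    rng = np.random.default_rng(0)
    signal = np.concatenate([np.zeros(50), np.ones(50)]) + 0.1 * rng.standard_normal(100)
    denoised = tv1d_admm(signal, lam=1.0)

According to the abstract, the proposed method instead decomposes each ADMM step for a d-mode tensor into a set of independent, smaller subproblems that can be solved in parallel, with no linear system to solve.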