ParNes: a rapidly convergent algorithm for accurate recovery of sparse and approximately sparse signals

  • Authors:
  • Ming Gu; Lek-Heng Lim; Cinna Julie Wu

  • Affiliations:
  • Department of Mathematics, University of California at Berkeley, Berkeley, USA 94720-3840; Department of Statistics, University of Chicago, Chicago, USA 60637-1514; Department of Mathematics, University of California at Berkeley, Berkeley, USA 94720-3840

  • Venue:
  • Numerical Algorithms
  • Year:
  • 2013

Abstract

In this article, we propose an algorithm, NESTA-LASSO, for the lasso problem, i.e., an underdetermined linear least-squares problem with an $\ell_1$-norm constraint on the solution. We prove that, under the restricted isometry property (RIP) and a sparsity condition on the solution, NESTA-LASSO is guaranteed to be almost always locally linearly convergent. As in the case of the algorithm NESTA, proposed by Becker, Bobin, and Candès, we rely on Nesterov's accelerated proximal gradient method, which takes $O(\sqrt{1/\varepsilon})$ iterations to come within $\varepsilon > 0$ of the optimal value. We introduce a modification to Nesterov's method that regularly updates the prox-center in a provably optimal manner. The aforementioned linear convergence is in part due to this modification. In the second part of this article, we solve the basis pursuit denoising (BPDN) problem (i.e., approximating the minimum $\ell_1$-norm solution to an underdetermined least-squares problem) by using NESTA-LASSO in conjunction with the Pareto root-finding method employed by van den Berg and Friedlander in their SPGL1 solver. The resulting algorithm is called ParNes. We provide numerical evidence to show that it is comparable to currently available solvers.
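The inner problem the abstract describes, minimizing $\tfrac{1}{2}\|Ax-b\|_2^2$ subject to $\|x\|_1 \le \tau$, can be illustrated with a plain accelerated projected-gradient iteration. The sketch below is not the paper's NESTA-LASSO (it omits the smoothing and the prox-center updates that give the local linear convergence); it is a generic Nesterov-style scheme combined with the standard sort-based Euclidean projection onto the $\ell_1$ ball, written here purely to make the problem setting concrete. All function names are ours.

```python
import numpy as np

def project_l1_ball(v, tau):
    """Euclidean projection of v onto the l1 ball {x : ||x||_1 <= tau}
    (standard sort-and-threshold construction)."""
    if np.sum(np.abs(v)) <= tau:
        return v
    u = np.sort(np.abs(v))[::-1]            # magnitudes, descending
    css = np.cumsum(u)
    idx = np.arange(1, len(v) + 1)
    k = np.nonzero(u * idx > css - tau)[0][-1]  # largest feasible support size
    theta = (css[k] - tau) / (k + 1.0)          # soft-threshold level
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def accel_projected_gradient(A, b, tau, iters=500):
    """Nesterov-accelerated projected gradient for
    min 0.5*||Ax - b||_2^2  s.t.  ||x||_1 <= tau  (illustrative sketch)."""
    L = np.linalg.norm(A, 2) ** 2           # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(iters):
        grad = A.T @ (A @ y - b)
        x_new = project_l1_ball(y - grad / L, tau)
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)   # momentum step
        x, t = x_new, t_new
    return x
```

With $\tau$ set to the true signal's $\ell_1$ norm and a well-conditioned random $A$ (the RIP-style regime the abstract assumes), the iterates recover a sparse signal from underdetermined measurements; ParNes's outer Pareto root-finding then adjusts $\tau$ to match a target residual, which this sketch does not attempt.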