Sparse Approximate Solutions to Linear Systems
SIAM Journal on Computing
Atomic Decomposition by Basis Pursuit
SIAM Journal on Scientific Computing
Computers and Intractability: A Guide to the Theory of NP-Completeness
Nonmonotone Spectral Projected Gradient Methods on Convex Sets
SIAM Journal on Optimization
Smooth minimization of non-smooth functions
Mathematical Programming: Series A and B
Efficient projections onto the l1-ball for learning in high dimensions
Proceedings of the 25th International Conference on Machine Learning
Algorithm 890: Sparco: A Testing Framework for Sparse Reconstruction
ACM Transactions on Mathematical Software (TOMS)
Sparse reconstruction by separable approximation
IEEE Transactions on Signal Processing
Fixed-Point Continuation for $\ell_1$-Minimization: Methodology and Convergence
SIAM Journal on Optimization
Probing the Pareto Frontier for Basis Pursuit Solutions
SIAM Journal on Scientific Computing
Bregman Iterative Algorithms for $\ell_1$-Minimization with Applications to Compressed Sensing
SIAM Journal on Imaging Sciences
A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
SIAM Journal on Imaging Sciences
Fast image recovery using variable splitting and constrained optimization
IEEE Transactions on Image Processing
NESTA: A Fast and Accurate First-Order Method for Sparse Recovery
SIAM Journal on Imaging Sciences
On sparse representations in arbitrary redundant bases
IEEE Transactions on Information Theory
Just relax: convex programming methods for identifying sparse signals in noise
IEEE Transactions on Information Theory
In this article, we propose an algorithm, NESTA-LASSO, for the LASSO problem, i.e., an underdetermined linear least-squares problem with a 1-norm constraint on the solution. We prove that, under the restricted isometry property (RIP) and a sparsity condition on the solution, NESTA-LASSO is guaranteed to be almost always locally linearly convergent. As in the case of the algorithm NESTA, proposed by Becker, Bobin, and Candès, we rely on Nesterov's accelerated proximal gradient method, which takes $O(\sqrt{1/\varepsilon})$ iterations to come within $\varepsilon > 0$ of the optimal value. We introduce a modification to Nesterov's method that regularly updates the prox-center in a provably optimal manner; the local linear convergence mentioned above is due in part to this modification. In the second part of this article, we attempt to solve the basis pursuit denoising (BPDN) problem, i.e., approximating the minimum 1-norm solution to an underdetermined least-squares problem, by using NESTA-LASSO in conjunction with the Pareto root-finding method employed by van den Berg and Friedlander in their SPGL1 solver. The resulting algorithm is called PARNES. We provide numerical evidence showing that it is comparable to currently available solvers.
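To make the problem setting concrete, the sketch below solves the LASSO subproblem, $\min_x \tfrac{1}{2}\|Ax - b\|_2^2$ subject to $\|x\|_1 \le \tau$, with a plain Nesterov-accelerated projected gradient iteration. This is only an illustrative baseline under standard assumptions, not the NESTA-LASSO algorithm itself: it omits the smoothing and the prox-center updates described in the abstract. The $\ell_1$-ball projection follows the sorting-based method of Duchi et al. (cited above); all function names are ours.

```python
import numpy as np

def project_l1_ball(v, tau):
    """Euclidean projection of v onto {x : ||x||_1 <= tau}
    via the sorting-based method of Duchi et al."""
    if np.abs(v).sum() <= tau:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]          # sorted magnitudes, descending
    css = np.cumsum(u)
    idx = np.arange(1, len(u) + 1)
    rho = idx[u - (css - tau) / idx > 0][-1]
    theta = (css[rho - 1] - tau) / rho    # soft-threshold level
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def lasso_apg(A, b, tau, iters=2000):
    """Accelerated projected gradient (FISTA-style momentum) for
    min 0.5*||Ax - b||^2  s.t.  ||x||_1 <= tau.
    Illustrative only; lacks the prox-center updates of NESTA-LASSO."""
    L = np.linalg.norm(A, 2) ** 2         # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    for _ in range(iters):
        g = A.T @ (A @ y - b)             # gradient at the extrapolated point
        x_new = project_l1_ball(y - g / L, tau)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x
```

Under RIP-type conditions on $A$ and a sufficiently sparse solution, this kind of iteration recovers the sparse vector accurately; the paper's contribution is to accelerate exactly this regime by restarting Nesterov's method with updated prox-centers.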