We propose a new subgradient method for the minimization of nonsmooth convex functions over a convex set. To speed up computations, we use adaptive approximate projections that only require the iterates to move within a certain distance of the exact projections; this distance decreases over the course of the algorithm. In particular, the iterates may be infeasible throughout the entire procedure. Nevertheless, we provide conditions that ensure convergence to an optimal feasible point under suitable assumptions. One convergence result deals with step size sequences that are fixed a priori. Two further results handle dynamic Polyak-type step sizes that depend on a lower or upper estimate of the optimal objective function value, respectively. Additionally, we briefly sketch two applications: optimization with convex chance constraints, and finding the minimum-ℓ1-norm solution of an underdetermined linear system, an important problem in Compressed Sensing.
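To illustrate the core idea, here is a minimal sketch of a projected subgradient iteration with adaptive approximate projections. It is not the paper's algorithm: the feasible set is fixed as the Euclidean unit ball, the "approximate projection" simply tolerates iterates within a shrinking distance `eps_k` of the ball (so iterates may be infeasible, as the abstract describes), and the `1/(k+1)^2` tolerance schedule and `1/(k+1)` step sizes are illustrative assumptions.

```python
import numpy as np

def approx_project_ball(x, eps):
    """Move x to within distance eps of the unit ball (the result may
    still be infeasible, i.e. lie in the eps-enlarged ball)."""
    n = np.linalg.norm(x)
    if n <= 1.0 + eps:
        return x  # already close enough to the feasible set
    return x * (1.0 + eps) / n  # shrink only onto the enlarged ball

def subgradient_method(f, subgrad, x0, steps, eps0=1.0):
    """Subgradient descent with approximate projections; tracks the
    best *feasible* iterate found so far."""
    x, best, fbest = x0, x0, f(x0)
    for k, t in enumerate(steps):
        g = subgrad(x)
        # tolerance decreases over the course of the algorithm
        x = approx_project_ball(x - t * g, eps0 / (k + 1) ** 2)
        if np.linalg.norm(x) <= 1.0 and f(x) < fbest:
            best, fbest = x, f(x)
    return best, fbest

# toy problem: minimize f(x) = |x1 - 2| + |x2| over the unit ball;
# the optimum is x* = (1, 0) with f(x*) = 1
f = lambda x: abs(x[0] - 2.0) + abs(x[1])
subgrad = lambda x: np.array([np.sign(x[0] - 2.0), np.sign(x[1])])
steps = [1.0 / (k + 1) for k in range(500)]  # fixed a priori
xs, fs = subgradient_method(f, subgrad, np.array([0.0, 0.0]), steps)
```

With the Polyak-type variants mentioned in the abstract, the a-priori step list would instead be replaced by steps of the form `(f(x) - f_est) / ||g||**2`, where `f_est` is a lower or upper estimate of the optimal value.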