We present a unifying framework for nonsmooth convex minimization that brings together $\epsilon$-subgradient algorithms and methods for the convex feasibility problem. This development is a natural step for $\epsilon$-subgradient methods in the direction of constrained optimization, since the Euclidean projection frequently required by such methods is replaced by an approximate projection, which is often easier to compute. The framework is applied to incremental subgradient methods, yielding new algorithms suitable for large-scale optimization problems, such as those arising in tomographic imaging.
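The idea of replacing the exact Euclidean projection by an approximate one can be illustrated with a minimal sketch. The code below is not taken from the paper; the function names, the diminishing stepsize rule, and the specific approximate projection (a single subgradient projection onto a halfspace outer approximation of the feasible set $C = \{x : g(x) \le 0\}$) are illustrative assumptions in the spirit of the methods discussed.

```python
import numpy as np

def subgradient_projection(x, g, level=0.0):
    """Approximate projection onto {y : g(y) <= level}.

    g returns (value, subgradient) at x. Instead of the exact Euclidean
    projection onto the level set, we project onto the halfspace
        {y : g(x) + <grad, y - x> <= level},
    an outer approximation of the level set -- typically much cheaper
    to compute than the exact projection.
    """
    val, grad = g(x)
    denom = grad @ grad
    if val <= level or denom == 0.0:
        return x  # already (approximately) feasible
    return x - ((val - level) / denom) * grad

def incremental_subgradient(fs, g, x0, steps=200, alpha0=1.0):
    """Incremental subgradient method with approximate projection.

    fs: list of component functions f_i, each returning (value, subgradient);
        the objective is the sum of the f_i.
    g : constraint functional defining C = {x : g(x) <= 0}.
    """
    x = np.asarray(x0, dtype=float).copy()
    for k in range(steps):
        alpha = alpha0 / (k + 1)      # diminishing stepsize (illustrative choice)
        for f in fs:                  # one incremental pass over the components
            _, sub = f(x)
            x = subgradient_projection(x - alpha * sub, g)
    return x
```

For example, minimizing $\sum_i \tfrac12 (x - c_i)^2$ over the unit ball $\{x : \|x\|^2 \le 1\}$ with $c_i > 1$ drives the iterates toward the boundary point nearest the unconstrained minimizer; each approximate projection only moves partway back toward the ball, yet feasibility is recovered in the limit.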