We consider a version of the subgradient method for convex nonsmooth optimization involving subgradient averaging. Using a merit function approach in the space of decisions and subgradient estimates, we prove convergence of the primal variables to an optimal solution and of the dual variables to an optimal subgradient. Application to dual convex optimization problems is discussed.
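The iteration described above can be sketched as follows. This is a minimal illustration, not the paper's exact scheme: it assumes the averaged direction is a convex combination of the previous direction and the current subgradient with a fixed weight `tau`, and a diminishing step size `alpha0 / sqrt(k)`; both parameter names and choices are illustrative.

```python
import numpy as np

def subgradient_averaging(subgrad, x0, steps=5000, alpha0=1.0, tau=0.5):
    """Subgradient method with subgradient averaging (illustrative sketch).

    At each step the search direction d is an average of past subgradients:
        d_k = (1 - tau) * d_{k-1} + tau * g_k,
    and the iterate moves along -d_k with a diminishing step size.
    Returns the final primal iterate x and the averaged direction d
    (the estimate of an optimal subgradient).
    """
    x = np.asarray(x0, dtype=float)
    d = np.zeros_like(x)                       # averaged subgradient estimate
    for k in range(1, steps + 1):
        g = subgrad(x)                         # any subgradient of f at x
        d = (1.0 - tau) * d + tau * g          # subgradient averaging
        x = x - (alpha0 / np.sqrt(k)) * d      # diminishing step size
    return x, d

# Example: f(x) = |x1 - 1| + |x2 + 2|, nonsmooth with minimizer (1, -2);
# sign(x - x*) is a valid subgradient of f at x.
subgrad = lambda x: np.sign(x - np.array([1.0, -2.0]))
x_star, d_star = subgradient_averaging(subgrad, np.zeros(2))
```

In this sketch the averaged direction damps the oscillation of raw subgradients near a kink of the objective, which is the behavior the averaging step is meant to produce.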