We propose and analyze an inexact version of the modified subgradient (MSG) algorithm, which we call the IMSG algorithm, for nonsmooth and nonconvex optimization over a compact set. We prove that under approximate, i.e., inexact, minimization of the sharp augmented Lagrangian, the main convergence properties of the MSG algorithm are preserved for the IMSG algorithm. Inexact minimization may allow problems to be solved with less computational effort. We illustrate this through test problems, including an optimal bang-bang control problem, under several different inexactness schemes.
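To make the iteration concrete, below is a minimal Python sketch of how such an inexact MSG (IMSG) loop might look for an equality-constrained problem min f(x) subject to h(x) = 0 over a compact box. The problem form, the inner solver (scipy's Powell method), the growing inner iteration budget standing in for the inexactness scheme, and the step-size rule are all illustrative assumptions, not the exact scheme analyzed in the paper.

```python
# A minimal sketch of an inexact modified subgradient (IMSG) iteration,
# assuming min f(x) s.t. h(x) = 0 over a compact box.  The inner solver,
# inexactness schedule, and step-size rule are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

def sharp_lagrangian(x, u, c, f, h):
    """Sharp augmented Lagrangian: L(x, u, c) = f(x) - u.h(x) + c*||h(x)||."""
    hx = np.atleast_1d(h(x))
    return f(x) - u @ hx + c * np.linalg.norm(hx)

def imsg(f, h, bounds, u0, c0, x0, iters=50, eps=1e-2):
    u = np.atleast_1d(np.asarray(u0, dtype=float))
    c, x = float(c0), np.asarray(x0, dtype=float)
    for k in range(iters):
        # Inexact inner minimization over the compact box: a capped
        # iteration budget that grows with k stands in for the paper's
        # inexactness schemes.  Powell is a heuristic choice here, since
        # the sharp Lagrangian is nonsmooth.
        res = minimize(lambda z: sharp_lagrangian(z, u, c, f, h), x,
                       method='Powell', bounds=bounds,
                       options={'maxiter': 10 * (k + 1)})
        x = res.x
        hx = np.atleast_1d(h(x))
        norm_h = np.linalg.norm(hx)
        if norm_h < 1e-8:                 # (near-)feasible: stop
            break
        s = 1.0 / ((k + 1) * norm_h)      # illustrative diminishing step
        u = u - s * hx                    # dual multiplier update
        c = c + (s + eps) * norm_h        # penalty parameter increase
    return x, u, c

if __name__ == "__main__":
    # Toy usage: a nonconvex objective with one equality constraint.
    f = lambda x: np.cos(x[0]) + 0.1 * x[1] ** 2
    h = lambda x: np.array([x[0] + x[1] - 1.0])
    x, u, c = imsg(f, h, bounds=[(-2, 2), (-2, 2)],
                   u0=[0.0], c0=1.0, x0=[0.0, 0.0])
    print(x, u, c)
```

Letting the inner iteration budget grow across outer iterations is one simple way to tighten the inexactness over time; the paper's point is that the dual updates retain the MSG convergence guarantees even when each inner minimization is only approximate.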