A globally convergent algorithm based on the stabilized sequential quadratic programming (sSQP) method is presented for solving optimization problems with equality constraints and bounds. An attractive feature of this formulation is that no constraint qualifications are required. In contrast with classical globalization strategies for Newton-like methods, no merit function is used; instead, the scheme corrects the solutions of the subproblems by means of an inexact restoration procedure. The method is well defined, and every accumulation point of the generated primal sequence is either a Karush-Kuhn-Tucker point or a stationary point (feasible or not) of the problem of minimizing the infeasibility. Moreover, under suitable hypotheses, the sequence generated by the algorithm converges Q-linearly. Numerical experiments confirm the theoretical results.
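To make the core idea concrete, the following is a minimal sketch (not the authors' implementation) of the regularized sSQP step on a toy equality-constrained problem; bounds, the globalization safeguards, and the inexact restoration correction are omitted. The test problem, the choice of stabilization parameter sigma equal to the KKT residual norm, and all names below are assumptions made for illustration. The constraint is deliberately duplicated so that the Jacobian is rank-deficient (multipliers are non-unique), the degenerate situation sSQP is designed for: the plain Newton-KKT system is singular there, while the stabilized system remains solvable for any sigma > 0.

```python
import numpy as np

def ssqp(x, lam, max_iter=30, tol=1e-10):
    """Stabilized SQP iteration (illustrative sketch) for
         min x1^2 + x2^2   s.t.   x1 + x2 = 1  (constraint duplicated).
    The duplicated constraint makes the Jacobian rank-deficient, so the
    plain SQP (Newton-KKT) system is singular; the -sigma*I block of the
    sSQP system restores nonsingularity for any sigma > 0."""
    for _ in range(max_iter):
        g = 2.0 * x                              # gradient of f(x) = x1^2 + x2^2
        J = np.array([[1.0, 1.0], [2.0, 2.0]])   # rank-deficient constraint Jacobian
        h = np.array([x[0] + x[1] - 1.0,
                      2.0 * (x[0] + x[1] - 1.0)])
        res = np.linalg.norm(np.concatenate([g + J.T @ lam, h]))  # KKT residual
        if res < tol:
            break
        sigma = res                              # stabilization ~ KKT residual (assumed rule)
        H = 2.0 * np.eye(2)                      # Hessian of the Lagrangian (constraints linear)
        # Regularized KKT system of the sSQP subproblem in (d, lam_plus):
        #   H d + J^T lam_plus           = -g
        #   J d - sigma (lam_plus - lam) = -h
        K = np.block([[H, J.T], [J, -sigma * np.eye(2)]])
        rhs = np.concatenate([-g, -h - sigma * lam])
        sol = np.linalg.solve(K, rhs)
        x, lam = x + sol[:2], sol[2:]
    return x, lam

x, lam = ssqp(np.array([2.0, 0.0]), np.zeros(2))
```

Despite the degenerate constraints, each linear system is nonsingular, and the iterates approach the solution x = (0.5, 0.5). A merit-function-free globalization of the kind described in the abstract would wrap additional correction steps around this basic iteration.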