Augmented Lagrangian algorithms are popular tools for solving nonlinear programming problems. At each outer iteration of these methods a simpler optimization problem is solved, for which efficient algorithms can be used, especially when the problems are large. The best-known Augmented Lagrangian algorithm for minimization with inequality constraints is the Powell-Hestenes-Rockafellar (PHR) method. The main drawback of PHR is that the objective function of its subproblems is not twice continuously differentiable, which has motivated the introduction of many alternative Augmented Lagrangian methods. When the original nonlinear programming problem is convex, most of these alternatives admit interesting interpretations as proximal point methods applied to the dual problem. In this paper a numerical comparison among many of these methods is performed using all the suitable problems of the CUTE collection.
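To make the PHR scheme and its smoothness drawback concrete, the following is a minimal sketch on a one-dimensional toy problem (minimize x^2 subject to x >= 1, written as g(x) = 1 - x <= 0). The problem, penalty parameter, step sizes, and iteration counts are illustrative assumptions, not taken from the paper; the inner subproblem is solved by plain gradient descent only for simplicity. Note that the PHR term max(0, lam + rho*g(x))^2 is once, but not twice, continuously differentiable in x wherever lam + rho*g(x) = 0, which is exactly the drawback discussed above.

```python
# Toy problem (illustrative): minimize f(x) = x^2  s.t.  g(x) = 1 - x <= 0.
# KKT solution: x* = 1, multiplier lam* = 2.

def f(x): return x ** 2
def df(x): return 2.0 * x
def g(x): return 1.0 - x
def dg(x): return -1.0

def phr(x, lam, rho):
    # PHR augmented Lagrangian: the max(0, .)**2 term is C^1 but not C^2.
    return f(x) + (max(0.0, lam + rho * g(x)) ** 2 - lam ** 2) / (2.0 * rho)

def grad_phr(x, lam, rho):
    # Gradient in x; continuous, but its derivative jumps at lam + rho*g(x) = 0.
    return df(x) + max(0.0, lam + rho * g(x)) * dg(x)

def solve_subproblem(x, lam, rho, lr=0.01, iters=2000):
    # Inner solver: plain gradient descent (a placeholder for an efficient
    # large-scale method, as used in practice).
    for _ in range(iters):
        x -= lr * grad_phr(x, lam, rho)
    return x

x, lam, rho = 0.0, 0.0, 10.0
for _ in range(20):                       # outer iterations
    x = solve_subproblem(x, lam, rho)     # approximately minimize phr(., lam, rho)
    lam = max(0.0, lam + rho * g(x))      # PHR multiplier update
```

On this toy instance the outer iterates converge to the KKT pair (x, lam) ≈ (1, 2); the multiplier update is the projection of the first-order dual step onto the nonnegative orthant, which is what gives the method its proximal-point interpretation on the dual problem in the convex case.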