As is well known, $Q$-superlinear or $Q$-quadratic convergence of the primal-dual sequence generated by an optimization algorithm does not, in general, imply $Q$-superlinear convergence of the primal part. Primal convergence, however, is often of particular interest. For the sequential quadratic programming (SQP) algorithm, local primal-dual quadratic convergence can be established under the assumptions of uniqueness of the Lagrange multiplier associated with the solution and the second-order sufficient condition. At the same time, previous primal $Q$-superlinear convergence results for SQP required strengthening the first assumption to the linear independence constraint qualification. In this paper, we show that this strengthening of assumptions is actually not necessary. Specifically, we show that once primal-dual convergence is assumed or already established, a primal superlinear rate requires only a certain error bound estimate. This error bound holds, for example, under the second-order sufficient condition, which is needed for primal-dual local analysis in any case. Moreover, in some situations even second-order sufficiency can be relaxed to the weaker assumption that the multiplier in question is noncritical. Our study is performed for a rather general perturbed SQP framework which covers, in addition to SQP and quasi-Newton SQP, some other algorithms as well. For example, as a byproduct, we obtain primal $Q$-superlinear convergence results for the linearly constrained (augmented) Lagrangian methods, for which no primal $Q$-superlinear rate of convergence results were previously available. Another application of the general framework is sequential quadratically constrained quadratic programming methods. Finally, we discuss some difficulties with proving primal superlinear convergence for the stabilized version of SQP.
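To make the opening claim concrete, the standard definitions can be written as follows (a sketch in my own notation, not taken from the paper): if $z^k = (x^k, \lambda^k)$ denotes the primal-dual iterates converging to $\bar z = (\bar x, \bar\lambda)$, then $Q$-superlinear convergence of the pair,
\[
\| z^{k+1} - \bar z \| = o\bigl( \| z^k - \bar z \| \bigr), \qquad z^k = (x^k, \lambda^k),
\]
does not, in general, imply the corresponding primal estimate
\[
\| x^{k+1} - \bar x \| = o\bigl( \| x^k - \bar x \| \bigr),
\]
because the primal error $\| x^k - \bar x \|$ may at some iterations be much smaller than the dual error $\| \lambda^k - \bar\lambda \|$, so the combined rate says nothing about the ratio of consecutive primal errors alone.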