This paper is concerned with the open problem of whether the BFGS method with inexact line search converges globally when applied to nonconvex unconstrained optimization problems. We propose a cautious BFGS update and prove that the method with either a Wolfe-type or an Armijo-type line search converges globally if the function to be minimized has Lipschitz continuous gradients.
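To make the idea concrete, here is a minimal Python/NumPy sketch of one common form of a cautious BFGS iteration with an Armijo-type backtracking line search. It is an illustration under stated assumptions, not the paper's exact algorithm: the skip condition y^T s >= eps * ||g||^alpha * ||s||^2, the constants eps and alpha, and the backtracking parameters are all choices made for this example.

    import numpy as np

    def cautious_bfgs(f, grad, x0, eps=1e-6, alpha=1.0, tol=1e-8, max_iter=500):
        """Sketch of a cautious BFGS method (assumed update rule, see note above)."""
        n = x0.size
        B = np.eye(n)                 # Hessian approximation, kept positive definite
        x = x0.astype(float)
        g = grad(x)
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            d = np.linalg.solve(B, -g)  # quasi-Newton direction; descent since B is SPD
            # Armijo-type backtracking line search (parameters are assumptions)
            t, c, rho = 1.0, 1e-4, 0.5
            fx = f(x)
            while f(x + t * d) > fx + c * t * g.dot(d):
                t *= rho
            x_new = x + t * d
            g_new = grad(x_new)
            s, y = x_new - x, g_new - g
            # Cautious update: apply the BFGS formula only when the curvature
            # y^T s is sufficiently positive; otherwise keep B unchanged.
            if y.dot(s) >= eps * np.linalg.norm(g) ** alpha * s.dot(s):
                Bs = B.dot(s)
                B = B - np.outer(Bs, Bs) / s.dot(Bs) + np.outer(y, y) / y.dot(s)
            x, g = x_new, g_new
        return x

    # Usage example on the (nonconvex) Rosenbrock function:
    rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    rosen_grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])
    x_star = cautious_bfgs(rosen, rosen_grad, np.array([-1.2, 1.0]))

Skipping the update whenever the curvature condition fails keeps B positive definite, which is what guarantees a descent direction at every iteration even when the objective is nonconvex.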