In this paper, we propose a derivative-free quasi-Newton condition, which yields a new class of quasi-Newton updating formulas for unconstrained optimization. Each updating formula in this class is a rank-two update that preserves the positive definiteness of the second-derivative matrix of the quadratic model, and its first two terms coincide with the first two terms of the BFGS updating formula. We establish global convergence of quasi-Newton methods based on the updating formulas in this class, and superlinear convergence of a particular quasi-Newton method among them. We then propose a special derivative-free quasi-Newton updating formula that repeatedly applies the new quasi-Newton condition. Numerical results are reported.
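The abstract does not give the new derivative-free formula itself, but it states that the first two terms of each update agree with the classical BFGS update. For context, here is a minimal NumPy sketch of that classical rank-two BFGS update, including the standard curvature safeguard (s^T y > 0) under which positive definiteness of the Hessian approximation is preserved; this illustrates only the well-known BFGS formula, not the paper's proposed class.

```python
import numpy as np

def bfgs_update(B, s, y):
    """Classical BFGS rank-two update of a Hessian approximation B.

    B_{k+1} = B_k - (B_k s s^T B_k) / (s^T B_k s) + (y y^T) / (y^T s)

    The update is symmetric rank-two and keeps B positive definite
    whenever the curvature condition s^T y > 0 holds; otherwise we
    skip the update, a common practical safeguard.
    """
    Bs = B @ s
    sBs = s @ Bs
    sy = s @ y
    if sy <= 1e-12 * np.linalg.norm(s) * np.linalg.norm(y):
        return B  # curvature condition fails: keep the current B
    return B - np.outer(Bs, Bs) / sBs + np.outer(y, y) / sy
```

By construction the updated matrix satisfies the secant (quasi-Newton) condition B_{k+1} s = y, which is the property the paper's derivative-free condition generalizes.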