The symmetric rank-one (SR1) update is one of the competitive formulas among quasi-Newton (QN) methods. In this paper, we propose modified SR1 updates based on modified secant equations that use both gradient and function-value information. Furthermore, to avoid loss of positive definiteness and zero denominators in the new SR1 updates, we apply a restart procedure to the update. Three new algorithms are given that improve the Hessian approximation of the SR1 method via modified secant equations. Numerical results show that the proposed algorithms are very encouraging, and their advantage over the standard SR1 and BFGS updates is clearly observed.
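To illustrate the ingredients the abstract describes, here is a minimal sketch of an SR1 iteration that combines a modified secant vector (built from function values as well as gradients) with a restart safeguard against near-zero denominators. The function and parameter names, the particular modified secant formula (a common choice from the literature, y* = y + (θ/sᵀs)s with θ = 2(f_k − f_{k+1}) + (g_k + g_{k+1})ᵀs), and the restart-to-identity rule are illustrative assumptions, not the paper's exact algorithms.

```python
import numpy as np

def modified_sr1_minimize(f, grad, x0, max_iter=200, tol=1e-8, r=1e-8):
    """Illustrative SR1 method with a modified secant vector and a restart
    safeguard. Hypothetical sketch, not the paper's exact algorithm."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    B = np.eye(n)                       # Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -np.linalg.solve(B, g)      # quasi-Newton direction
        # Backtracking Armijo line search.
        a, fx = 1.0, f(x)
        while f(x + a * p) > fx + 1e-4 * a * (g @ p):
            a *= 0.5
            if a < 1e-12:
                break
        s = a * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        # Modified secant vector using function values as well as gradients:
        # theta = 2(f_k - f_{k+1}) + (g_k + g_{k+1})^T s  (assumed formula).
        theta = 2.0 * (fx - f(x_new)) + (g + g_new) @ s
        y_star = y + (theta / (s @ s)) * s
        v = y_star - B @ s
        d = v @ s
        # Restart safeguard: apply the SR1 update only when the denominator
        # is safely nonzero; otherwise restart from the identity.
        if abs(d) >= r * np.linalg.norm(s) * np.linalg.norm(v):
            B = B + np.outer(v, v) / d
        else:
            B = np.eye(n)
        x, g = x_new, g_new
    return x
```

On a convex quadratic the modified secant term θ vanishes, so the sketch reduces to a standard SR1 iteration and the safeguard is what keeps the update well defined when the curvature denominator degenerates.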