Updating conjugate directions by the BFGS formula
Mathematical Programming: Series A and B
Efficient generalized conjugate gradient algorithms, Part 1: theory
Journal of Optimization Theory and Applications
Multi-step quasi-Newton methods for optimization
ICCAM'92: Proceedings of the Fifth International Conference on Computational and Applied Mathematics
Using function-values in multi-step quasi-Newton methods
Proceedings of the Sixth International Congress on Computational and Applied Mathematics
A globally convergent version of the Polak-Ribière conjugate gradient method
Mathematical Programming: Series A and B
Testing Unconstrained Optimization Software
ACM Transactions on Mathematical Software (TOMS)
A modified BFGS method and its global convergence in nonconvex minimization
Journal of Computational and Applied Mathematics - Special issue on nonlinear programming and variational inequalities
Modified Two-Point Stepsize Gradient Methods for Unconstrained Optimization
Computational Optimization and Applications
Convergence Properties of Nonlinear Conjugate Gradient Methods
SIAM Journal on Optimization
A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
SIAM Journal on Optimization
Global Convergence Properties of Nonlinear Conjugate Gradient Methods with Modified Secant Condition
Computational Optimization and Applications
New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
Journal of Computational and Applied Mathematics
Local and superlinear convergence of quasi-Newton methods based on modified secant conditions
Journal of Computational and Applied Mathematics
Multi-step nonlinear conjugate gradient methods for unconstrained minimization
Computational Optimization and Applications
Two modified scaled nonlinear conjugate gradient methods
Journal of Computational and Applied Mathematics
Following the approach proposed by Dai and Liao, we introduce two nonlinear conjugate gradient methods for unconstrained optimization problems. One of the proposed methods is based on a modified version of the secant equation proposed by Zhang, Deng and Chen, and by Zhang and Xu; the other is based on the modified BFGS update proposed by Yuan. An interesting feature of our methods is that they take into account both gradient and function values. Under proper conditions, we show that one of the proposed methods is globally convergent for general functions and that the other is globally convergent for uniformly convex functions. To enhance the performance of the line search procedure, we also propose a new approach for computing the initial steplength. We compare implementations of our methods with the efficient conjugate gradient methods proposed by Dai and Liao, and by Hestenes and Stiefel. Numerical test results show the efficiency of the proposed methods.
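The general scheme described here — a Dai–Liao-type conjugacy condition built on a secant vector modified to incorporate function values — can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the particular correction term θ_k (in the style of Zhang, Deng and Chen), the backtracking Armijo line search, and all parameter values (`t`, tolerances) are assumptions made for the sketch.

```python
import numpy as np

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-6, max_iter=1000):
    """Sketch of a Dai-Liao-type nonlinear CG method whose secant
    vector is modified to use function values as well as gradients."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g.copy()
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (a stand-in for the Wolfe
        # search a production implementation would use).
        f_x, alpha = f(x), 1.0
        while f(x + alpha * d) > f_x + 1e-4 * alpha * (g @ d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # Modified secant vector incorporating function values:
        #   theta_k = 6*(f_k - f_{k+1}) + 3*(g_k + g_{k+1})^T s_k
        theta = 6.0 * (f_x - f(x_new)) + 3.0 * ((g + g_new) @ s)
        y_mod = y + (theta / (s @ s)) * s
        # Dai-Liao conjugacy parameter built from the modified vector.
        denom = d @ y_mod
        beta = 0.0 if abs(denom) < 1e-16 else (g_new @ (y_mod - t * s)) / denom
        d = -g_new + beta * d
        if g_new @ d >= 0.0:  # safeguard: restart when not a descent direction
            d = -g_new.copy()
        x, g = x_new, g_new
    return x
```

On a uniformly convex test function such as f(x) = (x₁ − 1)² + 2(x₂ + 2)², the iteration converges to the minimizer (1, −2) from the origin; the descent-direction restart is a common safeguard when the line search is inexact.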