Although the Liu-Storey (LS) nonlinear conjugate gradient method has a structure similar to the well-known Polak-Ribière-Polyak (PRP) and Hestenes-Stiefel (HS) methods, it has received comparatively little attention. In this paper, based on the memoryless BFGS quasi-Newton method, we propose a new LS-type method that converges globally for general functions under the Grippo-Lucidi line search. Moreover, we modify this new LS method so that the modified scheme is globally convergent for nonconvex minimization when the strong Wolfe line search is used. Numerical results are also reported.
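To fix ideas, a minimal sketch of a nonlinear conjugate gradient iteration using the classical Liu-Storey parameter beta_k = g_{k+1}^T y_k / (-d_k^T g_k) is given below. This illustrates only the standard LS scheme, not the modified method proposed in the paper; the backtracking Armijo search and the steepest-descent restart are simplifying assumptions standing in for the Grippo-Lucidi / strong Wolfe line searches discussed above.

```python
import numpy as np

def ls_conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=1000):
    """Nonlinear CG with the classical Liu-Storey (LS) beta.

    Assumptions (illustrative only): a backtracking Armijo line
    search replaces the Grippo-Lucidi / strong Wolfe searches, and
    the direction is restarted with steepest descent whenever it
    fails to be a descent direction.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                      # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search
        alpha, c1 = 1.0, 1e-4
        fx, gtd = f(x), g @ d
        while f(x + alpha * d) > fx + c1 * alpha * gtd:
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        # Liu-Storey parameter: beta = g_{k+1}^T y_k / (-d_k^T g_k)
        denom = -(d @ g)
        beta = (g_new @ y) / denom if abs(denom) > 1e-16 else 0.0
        d = -g_new + beta * d
        if g_new @ d >= 0:      # restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x
```

For example, on the strictly convex quadratic f(x) = (1/2) x^T A x - b^T x with symmetric positive definite A, the iteration drives the residual A x - b toward zero.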