Practical methods of optimization (2nd ed.)
Efficient hybrid conjugate gradient techniques. Journal of Optimization Theory and Applications.
Global convergence result for conjugate gradient methods. Journal of Optimization Theory and Applications.
Optimization: algorithms and consistent approximations
A globally convergent version of the Polak-Ribière conjugate gradient method. Mathematical Programming, Series A and B.
Testing Unconstrained Optimization Software. ACM Transactions on Mathematical Software (TOMS).
On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems. SIAM Journal on Optimization.
A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property. SIAM Journal on Optimization.
A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search. SIAM Journal on Optimization.
Global convergence of conjugate gradient methods for unconstrained optimization generally requires the Wolfe or strong Wolfe line search. In this article, we propose a cautious DY (Dai-Yuan) conjugate gradient method and prove that, combined with the Armijo line search, it converges globally whenever the objective function has a Lipschitz continuous gradient. We also present preliminary numerical results illustrating the efficiency of the proposed method.
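To make the setting concrete, the following is a minimal sketch of a DY conjugate gradient iteration with a backtracking Armijo line search. The standard DY update beta_k = ||g_{k+1}||^2 / d_k^T(g_{k+1} - g_k) is used; the article's exact "cautious" rule is not reproduced here, so as a stand-in assumption the sketch simply restarts with the steepest-descent direction when the DY denominator is near zero or the new direction fails to be a descent direction. All parameter names and defaults are illustrative.

```python
import numpy as np

def dy_cg_armijo(f, grad, x0, tol=1e-6, max_iter=500,
                 s=1.0, rho=0.5, sigma=1e-4):
    """Sketch: Dai-Yuan conjugate gradient with Armijo backtracking.

    The restart safeguard below is an assumption standing in for the
    article's cautious rule, not the paper's exact condition.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Armijo line search: largest alpha = s * rho^m satisfying
        # f(x + alpha d) <= f(x) + sigma * alpha * g^T d
        alpha = s
        while f(x + alpha * d) > f(x) + sigma * alpha * (g @ d):
            alpha *= rho
            if alpha < 1e-12:   # guard against an endless backtrack
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        denom = d @ (g_new - g)
        # DY update: beta = ||g_new||^2 / d^T(g_new - g)
        if abs(denom) > 1e-12:
            beta = (g_new @ g_new) / denom
            d = -g_new + beta * d
            if g_new @ d >= 0:  # not a descent direction: restart
                d = -g_new
        else:
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = dy_cg_armijo(f, grad, np.zeros(2))
```

On this quadratic the gradient is Lipschitz continuous (constant ||A||), matching the smoothness assumption under which the article proves global convergence.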