In this paper, we are concerned with conjugate gradient methods for solving unconstrained optimization problems. It is well known that the direction generated by a conjugate gradient method may fail to be a descent direction of the objective function. We make a slight modification to the Fletcher–Reeves (FR) method so that the direction generated by the modified method is always a descent direction for the objective function. This property depends neither on the line search used nor on the convexity of the objective function. Moreover, the modified method reduces to the standard FR method when the line search is exact. Under mild conditions, we prove that the modified method with an Armijo-type line search is globally convergent even if the objective function is nonconvex. We also present numerical results that demonstrate the efficiency of the proposed method.
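The abstract does not give the modified direction formula, but the stated properties (descent independent of the line search, reduction to standard FR under exact line search) can be realized by a known scaling scheme: take d_k = -θ_k g_k + β_k^FR d_{k-1} with θ_k chosen so that g_k^T d_k = -‖g_k‖². The sketch below, with an Armijo-type backtracking line search, illustrates this idea; the scaling θ_k and all parameter values are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def modified_fr_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Sketch of a modified Fletcher-Reeves CG method (hypothetical formulation).

    Direction: d_k = -theta_k * g_k + beta_fr * d_{k-1}, with
    theta_k = 1 + beta_fr * (g_k @ d_{k-1}) / ||g_k||^2, which gives
    g_k @ d_k = -||g_k||^2, i.e. descent regardless of the line search.
    Under exact line search g_k @ d_{k-1} = 0, so theta_k = 1 and the
    iteration reduces to the standard FR method.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # steepest descent as the initial direction
    for _ in range(max_iter):
        gnorm2 = g @ g
        if np.sqrt(gnorm2) < tol:
            break
        # Armijo-type backtracking line search (illustrative constants)
        alpha, rho, c1 = 1.0, 0.5, 1e-4
        fx, slope = f(x), g @ d  # slope = -||g||^2 < 0 by construction
        while f(x + alpha * d) > fx + c1 * alpha * slope:
            alpha *= rho
        x = x + alpha * d
        g_new = grad(x)
        beta_fr = (g_new @ g_new) / gnorm2  # standard FR coefficient
        theta = 1.0 + beta_fr * (g_new @ d) / (g_new @ g_new)
        d = -theta * g_new + beta_fr * d  # descent direction by construction
        g = g_new
    return x
```

A quick check of the descent claim: g_new @ d = -θ‖g_new‖² + β(g_new @ d_old) = -‖g_new‖², so the Armijo condition always admits a positive step size.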