Practical methods of optimization; (2nd ed.)
A tool for the analysis of Quasi-Newton methods with application to unconstrained minimization
SIAM Journal on Numerical Analysis
Efficient generalized conjugate gradient algorithms, Part 1: theory
Journal of Optimization Theory and Applications
CUTE: constrained and unconstrained testing environment
ACM Transactions on Mathematical Software (TOMS)
A modified BFGS method and its global convergence in nonconvex minimization
Journal of Computational and Applied Mathematics - Special issue on nonlinear programming and variational inequalities
A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
SIAM Journal on Optimization
Convergence Properties of the BFGS Algorithm
SIAM Journal on Optimization
A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
SIAM Journal on Optimization
Some descent three-term conjugate gradient methods and their global convergence
Optimization Methods & Software
Optimization Methods & Software - Dedicated to Professor Michael J.D. Powell on the occasion of his 70th birthday
In this article, based on a modified secant equation, we propose a modified Hestenes-Stiefel (HS) conjugate gradient method whose form is similar to that of the CG-DESCENT method proposed by Hager and Zhang (SIAM J Optim 16:170-192, 2005). The proposed method generates sufficient descent directions without any line search. Under mild conditions, we show that it is globally convergent with the Armijo line search, and we establish the R-linear convergence rate of the modified HS method. Preliminary numerical results show that the proposed method is promising and competitive with the well-known CG-DESCENT method.
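For orientation, the sketch below shows a classical Hestenes-Stiefel nonlinear conjugate gradient iteration with a backtracking Armijo line search, the two ingredients the abstract refers to. This is an illustrative assumption-laden sketch, not the paper's modified HS update (which relies on a modified secant equation whose details are not given here); the function names and tolerances are our own.

```python
import numpy as np

def hs_conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=1000):
    """Nonlinear CG with the classical Hestenes-Stiefel beta and a
    backtracking Armijo line search. Illustrative only: the article's
    *modified* HS update is not reproduced here."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                      # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking: shrink t until
        # f(x + t d) <= f(x) + c1 * t * g^T d
        t, c1 = 1.0, 1e-4
        gtd = g @ d
        while f(x + t * d) > f(x) + c1 * t * gtd:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g
        # Classical HS formula: beta = (g_new^T y) / (d^T y)
        denom = d @ y
        beta = (g_new @ y) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d   # new search direction
        x, g = x_new, g_new
    return x

# Usage: minimize the convex quadratic f(x) = x^T A x / 2 - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = hs_conjugate_gradient(f, grad, np.zeros(2))
```

Note that the classical HS direction is not guaranteed to be a descent direction under an inexact line search; guaranteeing sufficient descent independently of the line search is precisely the property the modified method above shares with CG-DESCENT.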