Multi-step quasi-Newton methods for optimization
ICCAM'92: Proceedings of the Fifth International Conference on Computational and Applied Mathematics
Using function-values in multi-step quasi-Newton methods
Proceedings of the 6th International Congress on Computational and Applied Mathematics
A globally convergent version of the Polak-Ribière conjugate gradient method
Mathematical Programming: Series A and B
The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
SIAM Journal on Optimization
Convergence Properties of Nonlinear Conjugate Gradient Methods
SIAM Journal on Optimization
Global convergence of nonmonotone descent methods for unconstrained optimization problems
Journal of Computational and Applied Mathematics - Special issue: Papers presented at the 1st Sino-Japan Optimization Meeting, 26-28 October 2000, Hong Kong, China
The Superlinear Convergence of a Modified BFGS-Type Method for Unconstrained Optimization
Computational Optimization and Applications
A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
SIAM Journal on Optimization
Convergence of nonmonotone line search method
Journal of Computational and Applied Mathematics
New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
Journal of Computational and Applied Mathematics
Multi-step nonlinear conjugate gradient methods for unconstrained minimization
Computational Optimization and Applications
Some descent three-term conjugate gradient methods and their global convergence
Optimization Methods & Software
A new family of conjugate gradient methods
Journal of Computational and Applied Mathematics
Two new conjugate gradient methods based on modified secant equations
Journal of Computational and Applied Mathematics
Notes on the Dai-Yuan-Yuan modified spectral gradient method
Journal of Computational and Applied Mathematics
Two effective hybrid conjugate gradient algorithms based on modified BFGS updates
Numerical Algorithms
Conjugate gradient methods play a special role in solving large-scale optimization problems owing to the simplicity of their iterations, their convergence properties, and their low memory requirements. In this work, we propose a new class of spectral conjugate gradient methods that ensures sufficient descent independently of the accuracy of the line search. Moreover, an attractive property of the proposed methods is that they achieve high-order accuracy in approximating the second-order curvature information of the objective function by utilizing the modified secant condition proposed by Babaie-Kafaki et al. [S. Babaie-Kafaki, R. Ghanbari, N. Mahdavi-Amiri, Two new conjugate gradient methods based on modified secant equations, Journal of Computational and Applied Mathematics 234 (2010) 1374-1386]. Furthermore, a global convergence result for general functions is established provided that the line search satisfies the Wolfe conditions. Our numerical experiments indicate that the proposed methods are, in general, superior to the classical conjugate gradient methods in terms of efficiency and robustness.
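The abstract does not state the authors' update formulas, so the following is only an illustrative sketch of the general scheme it describes: a spectral conjugate gradient iteration, d_{k+1} = -theta_k g_{k+1} + beta_k d_k, paired with a line search enforcing the Wolfe conditions and a restart safeguard that preserves descent. The Barzilai-Borwein-style spectral parameter, the PR+ conjugacy parameter, all function names, and the quadratic test problem are assumptions for this sketch, not the methods proposed in the paper.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.4, alpha=1.0, max_iter=50):
    """Bisection/doubling search for a step satisfying the (weak) Wolfe conditions."""
    fx, gx = f(x), grad(x)
    slope = gx @ d            # must be negative: d is a descent direction
    lo, hi = 0.0, np.inf
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * slope:       # Armijo fails: shrink
            hi = alpha
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < c2 * slope:           # curvature fails: grow
            lo = alpha
            alpha = 2.0 * lo if np.isinf(hi) else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha

def spectral_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic (hypothetical) spectral CG: d = -theta*g_new + beta*d_old."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # Barzilai-Borwein-style spectral scaling (s'y > 0 under Wolfe conditions)
        theta = (s @ s) / (s @ y) if s @ y > 1e-12 else 1.0
        beta = max(0.0, (g_new @ y) / (g @ g))               # PR+ parameter
        d = -theta * g_new + beta * d
        if d @ g_new >= 0.0:                                 # safeguard: restart
            d = -theta * g_new
        x, g = x_new, g_new
    return x

# Demo on a strongly convex quadratic f(x) = 0.5 x'Ax - b'x, minimizer A^{-1}b.
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
f = lambda x: 0.5 * x @ A @ x - b @ x
g = lambda x: A @ x - b
x_min = spectral_cg(f, g, np.zeros(3))
```

The restart safeguard is what gives the sufficient-descent property mentioned in the abstract its practical analogue here: whenever the combined direction fails to be a descent direction, the iteration falls back to the scaled steepest-descent direction.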