In this paper, we present a multi-step memory gradient method with a Goldstein line search for unconstrained optimization problems and prove its global convergence under mild conditions. We also establish a linear convergence rate for the new method when the objective function is uniformly convex. Numerical results show that the new algorithm is well suited to large-scale optimization problems and is more stable in practical computation than other methods of this kind.
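Since the abstract describes the method only at a high level, the sketch below illustrates one plausible reading: a memory gradient direction of the form d_k = -g_k + Σ_i β_{k,i} d_{k-i} combined with a Goldstein line search. The damping rule for the β coefficients, the parameter eta, and the names goldstein_step and memory_gradient are illustrative assumptions, not the paper's actual update; the β bound is chosen only so that every d_k is provably a descent direction.

```python
import numpy as np

def goldstein_step(f, x, d, g, c=0.25, alpha0=1.0, max_iter=50):
    """Bracketing Goldstein line search: accept alpha with
    f(x) + (1 - c)*alpha*gd <= f(x + alpha*d) <= f(x) + c*alpha*gd,
    where gd = g.d < 0 and 0 < c < 1/2 (Goldstein conditions)."""
    gd = float(g @ d)
    fx = f(x)
    lo, hi, alpha = 0.0, np.inf, alpha0
    for _ in range(max_iter):
        fa = f(x + alpha * d)
        if fa > fx + c * alpha * gd:            # decrease insufficient: shrink
            hi = alpha
            alpha = 0.5 * (lo + hi)
        elif fa < fx + (1.0 - c) * alpha * gd:  # step too short: expand
            lo = alpha
            alpha = 2.0 * alpha if np.isinf(hi) else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha

def memory_gradient(f, grad, x0, m=3, eta=0.5, tol=1e-6, max_iter=1000):
    """Memory gradient sketch: d_k = -g_k + sum_i beta_i * d_{k-i}.
    The betas below are a hypothetical damping (not the paper's rule),
    bounded so that g_k.d_k <= -(1 - eta)*||g_k||^2 < 0 for 0 < eta < 1,
    i.e. every d_k is a descent direction."""
    x = np.asarray(x0, dtype=float)
    past = []                                   # last m search directions
    for _ in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        d = -g
        for d_old in past:
            # |beta * g.d_old| <= (eta/m)*||g||^2 by Cauchy-Schwarz
            beta = (eta / m) * gnorm**2 / (np.linalg.norm(d_old) * gnorm + 1e-16)
            d = d + beta * d_old
        alpha = goldstein_step(f, x, d, g)
        x = x + alpha * d
        past.append(d)
        if len(past) > m:
            past.pop(0)
    return x

# Usage on a uniformly convex quadratic f(x) = 0.5 * x.A.x
A = np.diag([1.0, 10.0, 100.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
print(memory_gradient(f, grad, np.array([1.0, 1.0, 1.0])))
```

With only the magnitude of each β contribution bounded, the directional derivative satisfies g·d ≤ -(1 - eta)‖g‖², so the Goldstein search always receives a descent direction; this mirrors the role such conditions play in the paper's global convergence argument, though the actual coefficient rule there may differ.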