Practical methods of optimization; (2nd ed.)
On the limited memory BFGS method for large scale optimization
Mathematical Programming: Series A and B
Testing Unconstrained Optimization Software
ACM Transactions on Mathematical Software (TOMS)
Trust-region methods
The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
SIAM Journal on Optimization
Relaxed Steepest Descent and Cauchy-Barzilai-Borwein Method
Computational Optimization and Applications
On the Convergence of Descent Algorithms
Computational Optimization and Applications
Global convergence of nonmonotone descent methods for unconstrained optimization problems
Journal of Computational and Applied Mathematics - Special issue: Papers presented at the 1st Sino-Japan optimization meeting, 26-28 October 2000, Hong Kong, China
A gradient-related algorithm with inexact line searches
Journal of Computational and Applied Mathematics
In this paper we present a new class of memory gradient methods for unconstrained optimization problems and establish global convergence properties under mild conditions. In the new algorithms, a trust-region approach is used to guarantee global convergence. Numerical results show that these memory gradient methods are stable and efficient in practical computation. In particular, in some special cases the methods reduce to the BB (Barzilai-Borwein) method.
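For context on the BB method the abstract refers to, the following is a minimal sketch of the Barzilai-Borwein gradient iteration (using the BB1 step length); the function names and stopping rules are illustrative, not taken from the paper:

```python
def bb_gradient(grad, x0, max_iter=200, tol=1e-8):
    """Minimize a smooth function given its gradient `grad` (a list -> list map).

    Barzilai-Borwein (BB1) step: alpha_k = (s^T s) / (s^T y), where
    s = x_k - x_{k-1} and y = g_k - g_{k-1}. Illustrative sketch only.
    """
    x = list(x0)
    g = grad(x)
    alpha = 1e-3  # small initial step bootstraps the (s, y) history
    for _ in range(max_iter):
        x_new = [xi - alpha * gi for xi, gi in zip(x, g)]
        g_new = grad(x_new)
        s = [a - b for a, b in zip(x_new, x)]
        y = [a - b for a, b in zip(g_new, g)]
        sy = sum(si * yi for si, yi in zip(s, y))
        ss = sum(si * si for si in s)
        if abs(sy) < 1e-16:  # avoid division by (near) zero
            x, g = x_new, g_new
            break
        alpha = ss / sy  # BB1 step length
        x, g = x_new, g_new
        if max(abs(gi) for gi in g) < tol:
            break
    return x
```

On a strongly convex quadratic, e.g. f(x) = (x0 - 1)^2 + 10(x1 + 2)^2, the iterates converge to the minimizer (1, -2), typically nonmonotonically, which is characteristic of BB-type steps.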