A nonmonotone line search technique for Newton's method. SIAM Journal on Numerical Analysis.
CUTE: constrained and unconstrained testing environment. ACM Transactions on Mathematical Software (TOMS).
Gradient Method with Retards and Generalizations. SIAM Journal on Numerical Analysis.
Modified Two-Point Stepsize Gradient Methods for Unconstrained Optimization. Computational Optimization and Applications.
The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem. SIAM Journal on Optimization.
Nonmonotone Spectral Projected Gradient Methods on Convex Sets. SIAM Journal on Optimization.
On the nonmonotone line search. Journal of Optimization Theory and Applications.
Nonmonotone Globalization Techniques for the Barzilai-Borwein Gradient Method. Computational Optimization and Applications.
A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization. SIAM Journal on Optimization.
The Superlinear Convergence of a Modified BFGS-Type Method for Unconstrained Optimization. Computational Optimization and Applications.
A limited memory BFGS-type method for large-scale unconstrained optimization. Computers & Mathematics with Applications.
Two effective hybrid conjugate gradient algorithms based on modified BFGS updates. Numerical Algorithms.
Journal of Computational and Applied Mathematics
In this paper, we give some notes on the two modified spectral gradient methods developed in [10]. These notes clarify the relationship between their stepsize formulae and some new secant equations arising in quasi-Newton methods. We also introduce two further choices of stepsize. By employing an efficient nonmonotone line search technique, we propose several new spectral gradient methods and, under mild conditions, show that they are globally convergent. Numerical experiments on a large set of test problems from the CUTEr library are reported, demonstrating the efficiency of the proposed methods.
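To make the ingredients of such a method concrete, the following is a minimal sketch of a spectral gradient iteration combined with a nonmonotone (Grippo-Lampariello-Lucidi-style) line search. The paper's own stepsize formulae and the specific secant equations from [10] are not reproduced here; the classical Barzilai-Borwein stepsize s^T s / s^T y is used as a stand-in, and all parameter values (memory length M, sufficient-decrease constant sigma, backtracking factor) are illustrative assumptions, not the authors' choices.

```python
import numpy as np

def bb_nonmonotone_gradient(f, grad, x0, M=10, sigma=1e-4,
                            tol=1e-6, max_iter=1000):
    """Spectral (Barzilai-Borwein) gradient method with a
    nonmonotone Armijo line search.  Illustrative sketch only:
    the BB1 stepsize s^T s / s^T y stands in for the paper's
    modified stepsize formulae."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1.0                          # initial spectral stepsize
    f_hist = [f(x)]                      # recent function values
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -alpha * g                   # spectral gradient direction
        # Nonmonotone Armijo: compare against the max of the last M values
        f_ref = max(f_hist[-M:])
        lam = 1.0
        while f(x + lam * d) > f_ref + sigma * lam * g.dot(d):
            lam *= 0.5                   # backtracking
        s = lam * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s.dot(y)
        # BB1 stepsize; fall back to 1 when the curvature condition fails
        alpha = s.dot(s) / sy if sy > 1e-12 else 1.0
        x, g = x_new, g_new
        f_hist.append(f(x))
    return x
```

On a convex quadratic, for example, the iteration converges to the minimizer even though the function values need not decrease monotonically, which is exactly the behavior the nonmonotone line search is designed to permit.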