Motivated by the superlinear behavior of the Barzilai-Borwein (BB) method on two-dimensional quadratics, we propose two gradient methods that adaptively choose either a small or a large step-size at each iteration. The small step-size serves primarily to induce a favorable descent direction for the next iteration, while the large step-size serves primarily to produce a sufficient reduction in the objective. Although the new algorithms remain only linearly convergent in the quadratic case, numerical experiments on typical test problems indicate that they compare favorably with the BB method and several other efficient gradient methods.
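The abstract does not spell out the adaptive switching rule, so the Python sketch below is only illustrative. It computes the two classical BB step-sizes for the convex quadratic f(x) = 0.5*x'Ax - b'x (the long step s's/s'y and the short step s'y/y'y) and picks between them with a placeholder ratio test; the function name adaptive_bb_gradient, the threshold kappa, and the switching criterion are assumptions standing in for the paper's actual rule.

    import numpy as np

    def adaptive_bb_gradient(A, b, x0, tol=1e-8, max_iter=1000, kappa=0.5):
        """Gradient method for f(x) = 0.5*x'Ax - b'x with A symmetric
        positive definite, switching between the long BB step s's/s'y and
        the short BB step s'y/y'y. The switching test (ratio threshold
        kappa) is a placeholder heuristic, not the paper's criterion."""
        x = np.asarray(x0, dtype=float)
        g = A @ x - b
        alpha = 1.0 / np.linalg.norm(A, 2)  # safe first step: 1/lambda_max
        k = 0
        for k in range(max_iter):
            if np.linalg.norm(g) <= tol:
                break
            x_new = x - alpha * g
            g_new = A @ x_new - b
            s, y = x_new - x, g_new - g
            # Classical BB step-sizes; Cauchy-Schwarz gives short <= long.
            alpha_long = (s @ s) / (s @ y)
            alpha_short = (s @ y) / (y @ y)
            # Placeholder adaptive rule: take the cautious short step when
            # the two estimates disagree strongly, the long step otherwise.
            alpha = alpha_short if alpha_short < kappa * alpha_long else alpha_long
            x, g = x_new, g_new
        return x, k

On a random symmetric positive definite test problem (e.g., A = M @ M.T + n * np.eye(n) for a random n-by-n matrix M), the loop terminates once the gradient norm falls below tol, and the returned iterate approximates np.linalg.solve(A, b).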