In this paper we propose new globalization strategies for the Barzilai and Borwein gradient method, based on suitable relaxations of the monotonicity requirements. In particular, we define a class of algorithms that combine nonmonotone watchdog techniques with nonmonotone line search rules, and we prove the global convergence of these schemes. We then report an extensive computational study showing the effectiveness of the proposed approach on large-scale unconstrained optimization problems.
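To make the ingredients concrete, the following is a minimal sketch of a Barzilai-Borwein gradient iteration safeguarded by a nonmonotone (Grippo-Lampariello-Lucidi style) line search. It is an illustration of the general idea only, not the exact scheme proposed in the paper: the watchdog component is omitted, the `BB1` steplength choice, the memory length `M`, and the test problem are all assumptions made for the example.

```python
import numpy as np

def bb_nonmonotone(f, grad, x0, max_iter=500, M=10, gamma=1e-4, tol=1e-6):
    """Barzilai-Borwein gradient method with a nonmonotone Armijo line search.

    Illustrative sketch only: uses the BB1 steplength and a
    max-over-last-M reference value; no watchdog phase.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1.0                      # initial steplength
    f_hist = [f(x)]                  # recent function values for the nonmonotone test
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -alpha * g               # BB step viewed as a scaled steepest-descent direction
        f_ref = max(f_hist[-M:])     # reference value: max over the last M iterates
        lam = 1.0
        # nonmonotone Armijo condition: accept f(x + lam*d) <= f_ref + gamma*lam*g'd
        while f(x + lam * d) > f_ref + gamma * lam * g.dot(d):
            lam *= 0.5
        x_new = x + lam * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s.dot(y)
        # BB1 steplength; fall back to 1.0 when curvature information is unusable
        alpha = s.dot(s) / sy if sy > 1e-12 else 1.0
        x, g = x_new, g_new
        f_hist.append(f(x))
    return x

# usage on a simple ill-conditioned convex quadratic (hypothetical test problem)
A = np.diag([1.0, 10.0, 100.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x_star = bb_nonmonotone(f, grad, np.ones(3))
```

The key relaxation is the reference value `f_ref`: by comparing against the maximum of the last `M` function values rather than the current one, the line search tolerates the nonmonotone behavior that the BB steplength typically produces, while still guaranteeing sufficient decrease over a window of iterations.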