Molecular conformations from distance matrices. Journal of Computational Chemistry.
Gradient Method with Retards and Generalizations. SIAM Journal on Numerical Analysis.
Nonmonotone Globalization Techniques for the Barzilai-Borwein Gradient Method. Computational Optimization and Applications.
Gradient Methods with Adaptive Step-Sizes. Computational Optimization and Applications.
Memory gradient method with Goldstein line search. Computers & Mathematics with Applications.
Convergence of memory gradient methods. International Journal of Computer Mathematics.
Acceleration of the EM algorithm via extrapolation methods: Review, comparison and new methods. Computational Statistics & Data Analysis.
Modified nonmonotone Armijo line search for descent method. Numerical Algorithms.
A box constrained gradient projection algorithm for compressed sensing. Signal Processing.
Gradient-Based Methods for Sparse Recovery. SIAM Journal on Imaging Sciences.
A Barzilai-Borwein-based heuristic algorithm for locating multiple facilities with regional demand. Computational Optimization and Applications.
The Chaotic Nature of Faster Gradient Descent Methods. Journal of Scientific Computing.
A Dynamical Tikhonov Regularization for Solving Ill-posed Linear Algebraic Systems. Acta Applicandae Mathematicae.
A globally optimal tri-vector method to solve an ill-posed linear system. Journal of Computational and Applied Mathematics.
The negative gradient direction for finding local minimizers has been associated with the classical steepest descent method, which behaves poorly except on very well-conditioned problems. We stress that the poor behavior of the steepest descent method is due to the optimal Cauchy choice of steplength, not to the choice of search direction. We discuss over- and under-relaxation of the optimal steplength. In fact, we study and extend recent nonmonotone choices of steplength that significantly enhance the behavior of the method. For a new particular case (the Cauchy-Barzilai-Borwein method), we present a convergence analysis and encouraging numerical results that illustrate the advantages of using nonmonotone overrelaxations of the gradient method.
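To make the nonmonotone steplength idea concrete, here is a minimal sketch of the Barzilai-Borwein (BB) gradient method on a convex quadratic f(x) = ½xᵀAx − bᵀx, whose gradient is Ax − b. The problem data (A, b), the tolerance, and the iteration cap are illustrative choices, not taken from the paper; the paper's Cauchy-Barzilai-Borwein variant combines this steplength with Cauchy steps.

```python
import numpy as np

def bb_gradient(A, b, x0, tol=1e-8, max_iter=500):
    """Barzilai-Borwein gradient method for the SPD quadratic 0.5 x'Ax - b'x."""
    x = x0.astype(float)
    g = A @ x - b          # gradient of the quadratic at x
    alpha = 1.0            # initial steplength (a plain gradient step)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        # BB1 steplength: alpha = (s's) / (s'y). The resulting method is
        # nonmonotone by design: f may increase on some iterations while
        # the iterates still converge for SPD quadratics.
        alpha = (s @ s) / (s @ y)
        x, g = x_new, g_new
    return x

A = np.diag([1.0, 10.0, 100.0])   # ill-conditioned diagonal quadratic
b = np.array([1.0, 1.0, 1.0])
x_star = bb_gradient(A, b, np.zeros(3))
print(np.allclose(A @ x_star, b))  # True: A x* = b at the minimizer
```

On this kind of ill-conditioned problem the classical Cauchy (exact line search) steplength zigzags, while the BB choice, despite allowing the objective to rise occasionally, typically converges in far fewer iterations.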