Modified Two-Point Stepsize Gradient Methods for Unconstrained Optimization. Computational Optimization and Applications.
On the asymptotic behaviour of some new gradient methods. Mathematical Programming, Series A and B.
A new two-step gradient-type method for large-scale unconstrained optimization. Computers & Mathematics with Applications.
An improved multi-step gradient-type method for large scale optimization. Computers & Mathematics with Applications.
A matrix-free quasi-Newton method for solving large-scale nonlinear systems. Computers & Mathematics with Applications.
Hi-index: 7.29
We propose a new monotone algorithm for unconstrained optimization within the framework of the Barzilai and Borwein (BB) method and analyze the convergence properties of this new descent method. Motivated by the fact that the BB method does not guarantee descent in the objective function at each iteration, yet performs better than the steepest descent method, we seek a stepsize formula that approximates the Hessian via the quasi-Cauchy equation while preserving monotonicity at each iteration. Practical insights into the effectiveness of the proposed technique are given by a numerical comparison with the BB method.
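For context, the standard (non-monotone) BB method that the abstract contrasts against can be sketched as follows. This is a minimal illustration of the classical BB1 stepsize, not the monotone variant proposed in the paper; the function names, safeguard threshold, and quadratic test problem are illustrative assumptions.

```python
import numpy as np

def bb_gradient(grad, x0, alpha0=1e-3, tol=1e-8, max_iter=500):
    """Classical (non-monotone) Barzilai-Borwein gradient method.

    Uses the BB1 stepsize alpha_k = (s^T s) / (s^T y), where
    s = x_k - x_{k-1} and y = g_k - g_{k-1}. Note that f(x_k)
    is not guaranteed to decrease at every iteration.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = alpha0  # initial stepsize before any BB information exists
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        # Safeguard: fall back to alpha0 if curvature s^T y is not positive.
        alpha = (s @ s) / sy if sy > 1e-12 else alpha0
        x, g = x_new, g_new
    return x

# Example: convex quadratic f(x) = 0.5 x^T A x - b^T x, gradient A x - b.
A = np.diag([1.0, 10.0, 100.0])
b = np.array([1.0, 1.0, 1.0])
x_star = bb_gradient(lambda x: A @ x - b, np.zeros(3))
```

The quasi-Cauchy idea referred to in the abstract replaces the scalar `alpha` above with a diagonal Hessian approximation constrained by the weak secant (quasi-Cauchy) condition s^T B s = s^T y; the paper's contribution is choosing such an approximation so that descent holds at every iteration.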