An improved multi-step gradient-type method for large scale optimization
Computers & Mathematics with Applications
In this paper, we propose improvements to a new gradient-type method for solving large-scale unconstrained optimization problems, in which data from the two previous steps are used to revise the current approximate Hessian. The new method resembles the Barzilai-Borwein (BB) method. The innovative feature of this approach is that the Hessian is approximated by a diagonal matrix derived from a modified weak secant equation, rather than by the scalar multiple of the identity matrix used in the BB method. With this approach, we obtain a higher-order accurate Hessian approximation than other existing BB-type methods. By incorporating a simple monotone strategy, global convergence of the new method is achieved. Practical insights into the effectiveness of the proposed method are given by numerical comparisons with the BB method and its variants.
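To make the idea concrete, the following is a minimal sketch (not the authors' exact method) of a gradient-type iteration with a diagonal Hessian approximation updated through the weak (quasi-Cauchy) secant condition s_k^T B_{k+1} s_k = s_k^T y_k. The least-change diagonal update used here, the safeguarding threshold, and the omission of the paper's modified multi-step data and monotone strategy are all simplifying assumptions for illustration.

```python
import numpy as np

def diagonal_quasi_cauchy_update(B_diag, s, y):
    # Least-change diagonal update satisfying the weak secant
    # (quasi-Cauchy) condition s^T B s = s^T y.  This is a standard
    # single-step version, not the paper's modified two-step equation.
    s2 = s * s
    denom = float(s2 @ s2)
    if denom == 0.0:
        return B_diag
    coeff = (float(s @ y) - float(s2 @ B_diag)) / denom
    return B_diag + coeff * s2

def diagonal_gradient_method(grad, x0, iters=200, tol=1e-8):
    # Gradient-type iteration x_{k+1} = x_k - B_k^{-1} g_k, where B_k
    # is diagonal (stored as a vector), so each step costs O(n) like BB.
    x = np.asarray(x0, dtype=float)
    B = np.ones_like(x)                 # start from the identity
    g = grad(x)
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - g / B               # diagonal solve, elementwise
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        B = diagonal_quasi_cauchy_update(B, s, y)
        B = np.maximum(B, 1e-8)         # safeguard: keep B positive definite
        x, g = x_new, g_new
    return x
```

Because B_k is diagonal, it can capture differing curvature along each coordinate, whereas the BB method forces a single scalar (1/alpha_k) I for all directions; this is the sense in which a diagonal approximation can be more accurate at the same per-iteration cost.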