In this paper, we propose an improved multi-step diagonal updating method for large-scale unconstrained optimization. Our approach constructs a new gradient-type method by means of interpolating curves, where the distances used to parameterize the interpolating polynomials are measured in a norm defined by a positive-definite matrix. By developing an implicit updating approach, we obtain an improved Hessian approximation in diagonal form while avoiding the computational expense of explicitly forming the improved approximation matrix. The effectiveness of the proposed method is evaluated through computational comparison with the BB method and its variants. We show that our method is globally convergent and requires only O(n) memory.
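The abstract does not reproduce the authors' update formulas; as a point of reference, a minimal sketch of the Barzilai-Borwein (BB) gradient baseline mentioned above is given below. The step-size safeguard, iteration limits, and the quadratic test problem are illustrative assumptions, not the authors' method.

```python
import numpy as np

def bb_gradient(f_grad, x0, max_iter=500, tol=1e-6):
    """Barzilai-Borwein (BB1) gradient method, the baseline the abstract compares against.

    f_grad(x) returns the gradient of the objective at x.
    Only O(n) vectors are stored, matching the memory footprint discussed above.
    """
    x = np.asarray(x0, dtype=float)
    g = f_grad(x)
    alpha = 1e-3                      # initial step size (illustrative choice)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g         # gradient step with a scalar step size
        g_new = f_grad(x_new)
        s = x_new - x                 # change in iterates
        y = g_new - g                 # change in gradients
        sy = s @ y
        # BB1 step size alpha = (s^T s) / (s^T y), safeguarded when curvature is non-positive
        alpha = (s @ s) / sy if sy > 1e-12 else 1e-3
        x, g = x_new, g_new
    return x

# Example usage on a simple convex quadratic (illustrative only)
A = np.diag([1.0, 10.0, 100.0])
grad = lambda x: A @ x
x_star = bb_gradient(grad, np.ones(3))
```

The proposed method replaces the single scalar step of BB with a diagonal Hessian approximation built from multi-step interpolation data, which is why it retains the same O(n) storage while using more curvature information per iteration.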