On large scale nonlinear least squares calculations. SIAM Journal on Scientific and Statistical Computing.
Practical Methods of Optimization (2nd ed.).
Multi-step quasi-Newton methods for optimization. ICCAM '92: Proceedings of the Fifth International Conference on Computational and Applied Mathematics.
Alternating multi-step quasi-Newton methods for unconstrained optimization. ICCAM '96: Proceedings of the Seventh International Congress on Computational and Applied Mathematics.
Testing Unconstrained Optimization Software. ACM Transactions on Mathematical Software (TOMS).
Memory gradient method with Goldstein line search. Computers & Mathematics with Applications.
Of the multistep quasi-Newton methods introduced by the authors in [1], the most successful was the so-called fixed-point method, which uses the existing Hessian approximation to compute, at each iteration, the parameters required in the interpolation. To avoid the burden of the additional matrix-vector products this approach requires, approximations based on the secant equation were proposed. In [2], a different way of dealing with this difficulty was proposed: standard single-step quasi-Newton updates were alternated, on successive iterations, with two-step updates, so that approximations were no longer necessary. Recent work has shown that the quantities required to compute the parameters referred to above can be computed exactly by means of a recurrence, so alternation is no longer the only way to avoid approximations. In this paper, we describe the derivation of this recurrence. We present the results of a range of numerical experiments comparing the three approaches: approximation, alternation, and recurrence. Finally, we show how the use of recurrences may be extended to multistep methods employing three or more steps.
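To make the alternation idea concrete, the following is a minimal, illustrative sketch (not the authors' exact algorithm): a BFGS-style method that, on alternate iterations, replaces the standard secant pair (s_k, y_k) with a two-step pair (s_k - δ·s_{k-1}, y_k - δ·y_{k-1}). The fixed weight `delta` is an assumption for illustration only; in the multistep methods discussed above that parameter comes from the interpolation, and the names `rosenbrock` and `alternating_two_step_bfgs` are our own.

```python
import numpy as np

def rosenbrock(x):
    # Classic Rosenbrock test function and its gradient (used here only
    # as a standard unconstrained-optimization test problem).
    f = 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2
    g = np.array([-400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
                  200.0 * (x[1] - x[0]**2)])
    return f, g

def bfgs_update(H, s, y):
    # Standard inverse-Hessian BFGS update applied to the pair (s, y).
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

def alternating_two_step_bfgs(x0, delta=0.1, tol=1e-8, max_iter=1000):
    # Illustrative alternating scheme: even iterations use the one-step
    # secant pair (s_k, y_k); odd iterations use the two-step pair
    # (s_k - delta*s_{k-1}, y_k - delta*y_{k-1}).  The constant `delta`
    # is a placeholder, not the interpolation-derived parameter.
    x = np.asarray(x0, dtype=float)
    f, g = rosenbrock(x)
    H = np.eye(len(x))
    s_prev = y_prev = None
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g
        # Backtracking Armijo line search.
        t = 1.0
        while True:
            f_new, g_new = rosenbrock(x + t * d)
            if f_new <= f + 1e-4 * t * (g @ d) or t < 1e-12:
                break
            t *= 0.5
        s, y = t * d, g_new - g
        # Choose the update pair; fall back to (s, y) if the two-step
        # pair violates the curvature condition w^T r > 0.
        if k % 2 == 1 and s_prev is not None:
            r, w = s - delta * s_prev, y - delta * y_prev
            if w @ r <= 1e-12 * np.linalg.norm(r) * np.linalg.norm(w):
                r, w = s, y
        else:
            r, w = s, y
        if w @ r > 0:          # skip the update entirely if curvature fails
            H = bfgs_update(H, r, w)
        x, f, g = x + t * d, f_new, g_new
        s_prev, y_prev = s, y
    return x, f, g

x, f, g = alternating_two_step_bfgs([-1.2, 1.0])
```

The curvature safeguard (reverting to the one-step pair when w^T r is not sufficiently positive) keeps the inverse-Hessian approximation positive definite, which is the same practical concern that motivates careful parameter choice in the interpolation-based methods.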