New implicit updates in multi-step quasi-Newton methods for unconstrained optimisation

  • Authors:
  • J. A. Ford; S. Tharmlikit

  • Affiliations:
  • Department of Computer Science, University of Essex, Wivenhoe Park, Colchester, Essex, CO4 3SQ, UK; Department of Computer Science, Faculty of Science, Burapha University, Bangsaen, Chonburi 20131, Thailand

  • Venue:
  • Journal of Computational and Applied Mathematics - Proceedings of the International Conference on Recent Advances in Computational Mathematics
  • Year:
  • 2003

Abstract

Multi-step quasi-Newton methods for optimisation (using data from more than one previous step to revise the current approximate Hessian) were introduced by Ford and Moghrabi (J. Comput. Appl. Math. 50 (1994) 305), who showed how to construct such methods by means of interpolating curves. These methods still employ standard quasi-Newton formulae, but with the vectors normally used in those formulae replaced by others determined from a multi-step version of the secant equation. Methods for defining the parameter values that correspond to the iterates on the interpolating curve (the 'accumulative' and 'fixed-point' approaches) were presented by Ford and Moghrabi (Optim. Methods Software 2 (1993) 357). Both the accumulative and the fixed-point methods measure the distances required to parameterise the interpolating polynomials via a norm defined by a positive-definite matrix M. The fixed-point algorithm which takes M to be the current approximate Hessian was found, experimentally, to be the best of the six multi-step methods studied in Ford and Moghrabi (1993), all of which exhibited improved numerical performance by comparison with the standard single-step BFGS method.

To produce a better parameterisation of the interpolation, Ford (Comput. Math. Appl. 42 (2001) 1083) developed the idea of 'implicit update' methods. The fundamental concept is to determine an 'improved' version of the Hessian approximation to be used in computing the metric, while avoiding the computational expense of actually calculating that improved version. Two implicit methods (denoted by I2 and I3) were developed from F2 in Ford (2001): I2 employed parameter values generated from an implicit single-step BFGS update, while I3 used values from an implicit two-step update. In this paper, we describe the derivation of new implicit updates which are similar to I3. The experimental results we present show that one of the new implicit methods produces markedly better performance than the existing implicit methods, particularly as the dimension of the test problem grows.
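For readers unfamiliar with the construction, the sketch below contrasts the standard (single-step) secant equation with the general form of the multi-step condition that these methods impose. The notation (X(τ) for the curve interpolating the iterates, G(τ) for the corresponding gradient curve, and τ_m for the parameter value assigned to the newest iterate) is a plausible rendering of the interpolating-curve idea, not a transcription of the paper's own equations.

```latex
% Standard (single-step) secant equation: the updated approximation
% B_{k+1} matches the gradient difference along the latest step.
\[
  B_{k+1} s_k = y_k, \qquad
  s_k = x_{k+1} - x_k, \qquad
  y_k = g_{k+1} - g_k .
\]

% Multi-step version (sketch): iterates and gradients are interpolated by
% curves X(\tau) and G(\tau) through the most recent points, and the update
% is required to match their derivatives at the parameter value \tau_m of
% the newest iterate, yielding modified vectors in place of s_k and y_k.
\[
  B_{k+1} r_k = w_k, \qquad
  r_k = X'(\tau_m), \qquad
  w_k = G'(\tau_m).
\]
```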
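The parameterisation question discussed in the abstract concerns how the values τ are assigned to the iterates. The following Python sketch illustrates the general idea of accumulating M-norm distances between consecutive iterates, and a generic BFGS update applied with whatever vector pair the multi-step construction supplies. The function names (m_norm, chord_parameters, bfgs_update) are illustrative only; the cited papers give the authors' precise definitions, including their choice of M (the current approximate Hessian in the fixed-point method F2, or an implicitly updated version of it in the implicit methods).

```python
import numpy as np

def m_norm(v, M):
    # ||v||_M = sqrt(v^T M v): the norm induced by a positive-definite
    # matrix M, used to measure distances between iterates.
    return np.sqrt(v @ (M @ v))

def chord_parameters(points, M):
    # Assign parameter values to successive iterates by accumulating the
    # M-norm distances between consecutive points (the general idea behind
    # an accumulative-style parameterisation; the origin and sign
    # conventions here are arbitrary).
    taus = [0.0]
    for prev, curr in zip(points, points[1:]):
        taus.append(taus[-1] + m_norm(curr - prev, M))
    return taus

def bfgs_update(B, s, y):
    # Standard BFGS update of the Hessian approximation B. In a multi-step
    # method the usual step/gradient differences (s, y) are replaced by the
    # vectors obtained from the multi-step secant equation.
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
```

The point of the implicit updates studied in the paper is that the matrix playing the role of M above is never formed explicitly; only the quantities needed to evaluate the induced distances are computed.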