Global and Superlinear Convergence of a Restricted Class of Self-Scaling Methods with Inexact Line Searches, for Convex Functions

  • Authors:
  • M. Al-Baali

  • Affiliations:
  • Department of Mathematics and Statistics, Sultan Qaboos University, Sultanate of Oman. E-mail: albaali@squ.edu.om

  • Venue:
  • Computational Optimization and Applications
  • Year:
  • 1998


Abstract

This paper studies the convergence properties of algorithms belonging to the class of self-scaling (SS) quasi-Newton methods for unconstrained optimization. This class depends on two parameters, say θ_k and τ_k, for which the choice τ_k = 1 gives the Broyden family of unscaled methods, where θ_k = 1 corresponds to the well-known DFP method. We propose simple conditions on these parameters that give rise to global convergence with inexact line searches, for convex objective functions. The q-superlinear convergence is achieved if further restrictions on the scaling parameter are introduced. These convergence results are an extension of the known results for the unscaled methods. Because the scaling parameter is heavily restricted, we consider a subclass of SS methods which satisfies the required conditions. Although convergence for the unscaled methods with θ_k ≥ 1 is still an open question, we show that global and superlinear convergence for SS methods is possible and present, in particular, a new SS-DFP method.
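To make the two-parameter family concrete, here is a minimal sketch of a self-scaling Broyden-family update of the inverse-Hessian approximation. It is not the paper's exact algorithm: the scaling placement and the mapping between θ_k and the usual Broyden parameter φ are assumptions chosen so that θ = 1 reproduces DFP, matching the convention stated in the abstract.

```python
import numpy as np

def ss_broyden_update(H, s, y, theta=1.0, tau=1.0):
    """Hypothetical sketch of one self-scaling Broyden-family update.

    H     -- current symmetric inverse-Hessian approximation
    s, y  -- step s_k = x_{k+1} - x_k and gradient change y_k
    tau   -- scaling parameter applied to the 'old curvature' part of H
    theta -- family parameter; here theta = 1 gives DFP (abstract's
             convention), via phi = 1 - theta in the standard form.
    """
    sy = float(s @ y)        # curvature s^T y (> 0 for convex f with a
                             # line search enforcing the Wolfe conditions)
    Hy = H @ y
    yHy = float(y @ Hy)
    v = s / sy - Hy / yHy    # standard Broyden-family correction vector
    phi = 1.0 - theta        # assumed mapping so theta = 1 -> DFP
    # Scale the inherited part of H by tau, then add the secant term.
    return tau * (H - np.outer(Hy, Hy) / yHy
                  + phi * yHy * np.outer(v, v)) + np.outer(s, s) / sy
```

Whatever θ and τ are chosen, the updated matrix satisfies the secant condition H_{k+1} y_k = s_k, since v is orthogonal to y; the conditions studied in the paper restrict θ_k and τ_k further to secure global and q-superlinear convergence.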