The symmetric rank-one quasi-Newton method is a space-dilation subgradient algorithm

  • Authors: Michael J. Todd
  • Affiliations: School of Operations Research and Industrial Engineering, College of Engineering, Cornell University, Ithaca, NY 14853-7501, USA
  • Venue: Operations Research Letters
  • Year: 1986

Abstract

It is well known that a particular choice of the parameters in Shor's subgradient algorithm with space dilation in the direction of the gradient yields the ellipsoid method. We show that another choice of these parameters leads to the quasi-Newton method that uses the symmetric rank-one update and direct prediction. One curious feature is that the sequence of approximate inverse Hessian matrices lags by one iteration in the former (space-dilation) description; this necessitates a different starting matrix or an unusual update at the first step. While the similarity between the update formulae in space-dilation and quasi-Newton methods has been observed by several researchers, our result seems to be the first to show a precise equivalence. We note that this equivalence apparently cannot be extended to other quasi-Newton methods (for smooth optimization) or space-dilation methods (for non-smooth optimization). Nevertheless, we feel that our result gives insight into the relationship between these two classes of algorithms, and we hope that it will suggest new efficient methods for non-smooth problems.
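
For readers unfamiliar with the two algorithm families compared in the abstract, the sketch below illustrates one iteration of each in its standard textbook form: the symmetric rank-one (SR1) quasi-Newton method with direct prediction (a full step along -H g, with no line search) and Shor's subgradient method with space dilation in the direction of the gradient. This is a minimal illustration under standard assumptions; the function names, the safeguard on the SR1 denominator, and the generic parameters step_size and alpha are our own choices, not the paper's specific parameter settings that establish the equivalence.

```python
import numpy as np

def sr1_direct_prediction_step(x, H, grad, grad_fn):
    """One iteration of the SR1 quasi-Newton method with direct prediction:
    take the full step -H @ grad (no line search), then update the inverse
    Hessian approximation H with the symmetric rank-one formula."""
    x_new = x - H @ grad                       # direct prediction: unit step along -H g
    grad_new = grad_fn(x_new)
    s = x_new - x                              # step taken
    y = grad_new - grad                        # change in gradient
    v = s - H @ y
    denom = v @ y
    # Skip the update when the denominator is (nearly) zero, a common safeguard.
    if abs(denom) > 1e-12 * np.linalg.norm(v) * np.linalg.norm(y):
        H = H + np.outer(v, v) / denom         # symmetric rank-one update
    return x_new, H, grad_new

def shor_dilation_step(x, B, subgrad, step_size, alpha):
    """One iteration of Shor's subgradient method with space dilation in the
    direction of the (sub)gradient.  B is the space-transformation matrix and
    alpha in (0, 1) is the dilation coefficient; specific choices of
    step_size and alpha recover, e.g., the ellipsoid method."""
    xi = B.T @ subgrad
    xi = xi / np.linalg.norm(xi)               # normalized dilation direction
    x_new = x - step_size * (B @ xi)           # move along the transformed direction
    B_new = B + (alpha - 1.0) * np.outer(B @ xi, xi)   # B_{k+1} = B_k R_alpha(xi)
    return x_new, B_new
```

In both sketches the per-iteration work is a rank-one modification of an n-by-n matrix, which is the surface similarity the abstract refers to; the paper's contribution is a precise choice of step and dilation parameters under which the two iterations coincide.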