Fast Second-Order Gradient Descent via O(n) Curvature Matrix-Vector Products

  • Authors:
  • Nicol N. Schraudolph

  • Affiliations:
  • -

  • Venue:
  • -
  • Year:
  • 2000

Abstract

We propose a generic method for iteratively approximating various second-order gradient steps - Newton, Gauss-Newton, Levenberg-Marquardt, and natural gradient - in linear time per iteration, using special curvature matrix-vector products that can be computed in O(n). Two recent acceleration techniques for online learning, matrix momentum and stochastic meta-descent (SMD), in fact implement this approach. Since both were originally derived by very different routes, this offers fresh insight into their operation, resulting in further improvements to SMD.
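The key ingredient the abstract refers to is the ability to multiply a curvature matrix (Hessian, Gauss-Newton, etc.) by an arbitrary vector in O(n) time without ever forming the matrix. Below is a minimal sketch, not the paper's own code, of how such Pearlmutter-style products can be obtained with JAX's forward-over-reverse differentiation; the toy model, function names, and data are illustrative assumptions.

```python
import jax
import jax.numpy as jnp

def loss(w, x, y):
    # Toy linear model with squared-error loss on a single example (x, y).
    pred = jnp.dot(w, x)
    return 0.5 * (pred - y) ** 2

def hessian_vector_product(w, v, x, y):
    # H v in O(n): differentiate the gradient along direction v
    # (forward-mode over reverse-mode), never forming H explicitly.
    grad_fn = lambda w_: jax.grad(loss)(w_, x, y)
    _, hv = jax.jvp(grad_fn, (w,), (v,))
    return hv

def gauss_newton_vector_product(w, v, x, y):
    # G v = J^T H_out J v for model f(w) and a convex output loss,
    # again in O(n) via one JVP and one VJP through the model.
    f = lambda w_: jnp.dot(w_, x)      # model output
    _, Jv = jax.jvp(f, (w,), (v,))     # J v (forward pass)
    Hout_Jv = Jv                       # output Hessian of 0.5*(f - y)^2 is 1
    _, vjp_fn = jax.vjp(f, w)
    (Gv,) = vjp_fn(Hout_Jv)            # J^T (H_out J v)
    return Gv

# Example usage with illustrative values.
w = jnp.array([1.0, -2.0, 0.5])
v = jnp.array([0.1, 0.0, -0.3])
x = jnp.array([2.0, 1.0, 3.0])
y = jnp.array(1.0)
print(hessian_vector_product(w, v, x, y))
print(gauss_newton_vector_product(w, v, x, y))
```

Each call costs only a small constant multiple of one gradient evaluation, which is what allows the second-order steps listed in the abstract to be approximated iteratively in linear time per iteration.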