Steepest descent with momentum for quadratic functions is a version of the conjugate gradient method

  • Authors:
  • Amit Bhaya; Eugenius Kaszkurewicz

  • Affiliations:
  • Department of Electrical Engineering, Federal University of Rio de Janeiro, PEE/COPPE/UFRJ, P.O. Box 68504, Rio de Janeiro, RJ 21945-970, Brazil (both authors)

  • Venue:
  • Neural Networks
  • Year:
  • 2004

Abstract

It is pointed out that the so-called momentum method, widely used in the neural network literature to accelerate backpropagation, is a stationary version of the conjugate gradient method. Connections with the continuous-time optimization method known as heavy ball with friction are also made. In both cases, adaptive (dynamic) choices of the so-called learning rate and momentum parameters are obtained from a control Liapunov function analysis of the system.
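The stationary momentum iteration discussed in the abstract can be sketched as follows. This is an illustrative example, not the authors' adaptive algorithm: it applies the heavy-ball update x_{k+1} = x_k - α∇f(x_k) + β(x_k - x_{k-1}) with fixed (stationary) parameters to a quadratic, using Polyak's classical choice of α and β from the extreme eigenvalues of the Hessian; the test matrix and iteration count are arbitrary.

```python
import numpy as np

# Minimize the quadratic f(x) = 0.5 x^T A x - b^T x with the momentum
# ("heavy ball") iteration. With fixed alpha and beta, this is the
# stationary scheme the paper relates to the conjugate gradient method.

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)        # symmetric positive definite Hessian
b = rng.standard_normal(5)
x_star = np.linalg.solve(A, b)     # exact minimizer, for checking

# Polyak's stationary parameters, built from the extreme eigenvalues of A.
eigs = np.linalg.eigvalsh(A)
mu, L = eigs[0], eigs[-1]
alpha = 4.0 / (np.sqrt(L) + np.sqrt(mu)) ** 2          # learning rate
beta = ((np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))) ** 2  # momentum

x_prev = np.zeros(5)
x = np.zeros(5)
for _ in range(200):
    grad = A @ x - b
    # x_{k+1} = x_k - alpha * grad + beta * (x_k - x_{k-1})
    x, x_prev = x - alpha * grad + beta * (x - x_prev), x

print(np.linalg.norm(x - x_star))  # residual is tiny after 200 steps
```

The paper's contribution goes further: rather than fixing α and β in advance, it derives adaptive (per-iteration) values via a control Liapunov function analysis.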