Extended least squares based algorithm for training feedforward networks

  • Authors:
  • J. Y. F. Yam; T. W. S. Chow

  • Affiliations:
  • City Univ. of Hong Kong, Kowloon

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 1997

Abstract

An extended least squares-based algorithm for training feedforward networks is proposed. The weights connecting the last hidden layer and the output layer are first evaluated by a least squares algorithm; the weights between the input and hidden layers are then evaluated using a modified gradient descent algorithm. This arrangement eliminates the stalling problem experienced by pure least squares-type algorithms while maintaining their characteristically fast convergence. In the investigated problems, the total number of floating point operations (FLOPs) required for the networks to converge using the proposed training algorithm is only 0.221%-16.0% of that required by the Levenberg-Marquardt algorithm. The number of floating point operations per iteration of the proposed algorithm is only 1.517-3.521 times that of the standard backpropagation algorithm.
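
The abstract does not spell out the algorithm, but the hybrid scheme it describes (solving the output-layer weights exactly by least squares, then adjusting the hidden-layer weights by gradient descent) can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's method: it assumes a single hidden layer with tanh activations and a squared-error loss, substitutes plain gradient descent for the paper's modified gradient descent rule, and all names (train_hybrid, lr, n_hidden) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_hybrid(X, Y, n_hidden=8, lr=0.01, epochs=200):
    """Hybrid trainer sketch: least squares for the output layer,
    gradient descent for the hidden layer (illustrative only)."""
    n_in = X.shape[1]
    W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))  # input -> hidden
    b1 = np.zeros(n_hidden)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)                       # hidden activations
        Ha = np.hstack([H, np.ones((len(X), 1))])      # append bias column
        # Output-layer weights solved exactly by linear least squares.
        W2a, *_ = np.linalg.lstsq(Ha, Y, rcond=None)
        W2, b2 = W2a[:-1], W2a[-1]
        E = (H @ W2 + b2) - Y                          # output error
        # Hidden-layer weights updated by ordinary gradient descent
        # (a stand-in for the paper's modified gradient descent rule).
        dH = (E @ W2.T) * (1.0 - H**2)                 # tanh derivative
        W1 -= lr * X.T @ dH / len(X)
        b1 -= lr * dH.mean(axis=0)
    return W1, b1, W2, b2

if __name__ == "__main__":
    # Toy regression demo: fit y = sin(x) on 200 samples.
    X = np.linspace(-3, 3, 200).reshape(-1, 1)
    Y = np.sin(X)
    W1, b1, W2, b2 = train_hybrid(X, Y)
    pred = np.tanh(X @ W1 + b1) @ W2 + b2
    print("MSE:", float(np.mean((pred - Y) ** 2)))
```

Because the output weights are re-solved exactly at every iteration, the gradient step only has to move the hidden-layer weights, which is consistent with the abstract's claim of fast convergence at a modest per-iteration cost over standard backpropagation.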