A Layer-by-Layer Least Squares based Recurrent Networks Training Algorithm: Stalling and Escape

  • Authors:
  • Siu-Yeung Cho;Tommy W. S. Chow

  • Affiliations:
  • Dept. of Electronic Engineering, City University of Hong Kong, Tat Chee Avenue, Kowloon, Hong Kong;Dept. of Electronic Engineering, City University of Hong Kong, Tat Chee Avenue, Kowloon, Hong Kong

  • Venue:
  • Neural Processing Letters
  • Year:
  • 1998


Abstract

The least squares based training algorithm is limited mainly by the stalling problem and by the evaluation error introduced by the transformation matrix, which can yield an unacceptable solution. This paper presents a new approach to recurrent network training based upon the Layer-by-Layer Least Squares algorithm that overcomes these problems. In the proposed algorithm, all weights are evaluated by the least squares method without evaluating a transformation matrix, which speeds up the rate of convergence. A probabilistic mechanism, based upon modified weight-update equations, is introduced to eliminate the stalling problem experienced by pure least squares computation. As a result, the proposed algorithm is able to escape from local minima and reach a good optimal solution while still maintaining fast convergence.
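The abstract does not give the paper's actual update equations, so the following is only a minimal illustrative sketch of the two ingredients it describes: solving one layer's weights in closed form by least squares, and a probabilistic perturbation that fires when the error stops decreasing (stalling). The function names, the escape probability `p_escape`, and the noise scale are all hypothetical choices for illustration, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

def least_squares_layer_weights(H, T):
    """Solve W minimizing ||H @ W - T||^2 in closed form via the
    pseudo-inverse: one layer's weights by pure least squares."""
    return np.linalg.pinv(H) @ T

def perturb_if_stalled(W, prev_err, err, scale=0.01, rng=rng):
    """Illustrative probabilistic escape mechanism (hypothetical):
    when the error improvement vanishes (stalling), add small Gaussian
    noise to the weights with a probability that approaches 1."""
    improvement = max(prev_err - err, 0.0)
    p_escape = np.exp(-improvement / (scale * max(err, 1e-12)))
    if rng.random() < p_escape:
        W = W + rng.normal(0.0, scale, size=W.shape)
    return W

# Toy usage: fit the output weights of a small linear readout.
X = rng.normal(size=(50, 4))                      # hidden activations
true_W = np.array([[1.0], [-2.0], [0.5], [3.0]])
T = X @ true_W + 0.01 * rng.normal(size=(50, 1))  # noisy targets
W = least_squares_layer_weights(X, T)
err = float(np.mean((X @ W - T) ** 2))
# No improvement between "iterations" here, so a perturbation is likely:
W = perturb_if_stalled(W, prev_err=err, err=err)
```

The closed-form solve is what gives least squares methods their fast convergence; the noise injection step stands in for the paper's probabilistic mechanism for leaving a local minimum once progress stalls.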