Online stability of backpropagation-decorrelation recurrent learning

  • Authors: Jochen J. Steil
  • Affiliation: Neuroinformatics Group, Faculty of Technology, Bielefeld University, Germany
  • Venue: Neurocomputing
  • Year: 2006

Abstract

We provide a stability analysis, based on nonlinear feedback theory, for the recently introduced backpropagation-decorrelation (BPDC) recurrent learning algorithm, which adapts only the output weights of a possibly large network and can therefore learn in O(N). Using a small-gain criterion, we derive a simple sufficient stability inequality. The condition can be monitored online to ensure that the recurrent network is stable, and it can in principle be applied to any network that adapts only its output weights. Based on these results, BPDC learning is further enhanced with an efficient online rescaling algorithm that stabilizes the network while it adapts. In simulations we find that this mechanism improves learning in the provably stable domain. As a byproduct, we show that BPDC is highly competitive on standard data sets, including the recently introduced CATS benchmark data [CATS data. URL: http://www.cis.hut.fi/lendasse/competition/competition.html].
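
To make the monitoring idea concrete, the following is a minimal NumPy sketch of one way such a sufficient condition could be checked and enforced online. It assumes a tanh-type network (Lipschitz constant 1) whose trained readout w_out feeds back into the state through fixed feedback weights w_fb; the names, the triangle-inequality bound, and the rescaling rule are illustrative assumptions, not the paper's exact inequality or algorithm.

    import numpy as np

    def small_gain_bound(W_res, w_out, w_fb, lipschitz=1.0):
        # Triangle-inequality estimate of the loop gain seen by the state
        # once the trained readout w_out is fed back through fixed w_fb:
        #   L * ( ||W_res||_2 + ||w_fb||_2 * ||w_out||_2 )
        # A sufficient small-gain condition is that this stays below 1.
        return lipschitz * (np.linalg.norm(W_res, 2)
                            + np.linalg.norm(w_fb) * np.linalg.norm(w_out))

    def rescale_output_weights(W_res, w_out, w_fb, safety=0.99,
                               lipschitz=1.0):
        # Online stabilization step (illustrative): if the monitored bound
        # exceeds `safety`, shrink w_out just enough that the sufficient
        # inequality holds again.
        fixed = lipschitz * np.linalg.norm(W_res, 2)
        fb_gain = lipschitz * np.linalg.norm(w_fb) * np.linalg.norm(w_out)
        if fixed + fb_gain > safety and fb_gain > 0.0:
            w_out = w_out * max(0.0, (safety - fixed) / fb_gain)
        return w_out

    # Usage inside an online training loop (sketch):
    rng = np.random.default_rng(0)
    N = 100
    W_res = rng.normal(scale=0.05, size=(N, N))  # fixed recurrent weights
    w_fb = rng.normal(scale=0.1, size=N)         # fixed output feedback
    w_out = rng.normal(scale=0.5, size=N)        # trained readout weights
    for step in range(1000):
        # ... an output-weight update (e.g. BPDC-style) would go here ...
        if small_gain_bound(W_res, w_out, w_fb) >= 1.0:
            w_out = rescale_output_weights(W_res, w_out, w_fb)

The triangle-inequality bound overestimates the true gain of the effective recurrent matrix, but it makes the rescaling factor available in closed form: since ||W_res||_2 can be precomputed once, the per-step check costs only the O(N) norm of w_out, consistent with the O(N) character of output-weight-only learning.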