Regularization and stability in reservoir networks with output feedback

  • Authors:
  • René Felix Reinhart, Jochen Jakob Steil

  • Affiliation (both authors):
  • Research Institute for Cognition and Robotics - CoR-Lab, Bielefeld University, Universitätsstr. 25, 33615 Bielefeld, Germany

  • Venue:
  • Neurocomputing
  • Year:
  • 2012

Abstract

Output feedback is crucial for autonomous and parameterized pattern generation with reservoir networks. Read-out learning affects the output feedback loop and can lead to error amplification. Regularization is therefore important for both generalization and reduction of error amplification. We show that regularization of the reservoir and the read-out layer reduces the risk of error amplification, mitigates parameter dependency and boosts the task-specific performance of reservoir networks with output feedback. We discuss the deeper connection between regularization of the learning process and stability of the trained network.