Output feedback is crucial for autonomous and parameterized pattern generation with reservoir networks. Read-out learning affects the output feedback loop and can lead to error amplification. Regularization is therefore important both for generalization and for reducing error amplification. We show that regularizing the reservoir and the read-out layer reduces the risk of error amplification, mitigates parameter dependence, and improves the task-specific performance of reservoir networks with output feedback. We further discuss the deeper connection between regularization of the learning process and stability of the trained network.
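The core mechanism described here, ridge regularization of the read-out so that the output feedback loop stays stable, can be illustrated with a minimal echo state network sketch. This is not the authors' implementation: the network sizes, the ridge strength `ridge`, the state-noise level `noise` (a common way to regularize the reservoir, assumed here, not necessarily the paper's method), and the sine-wave teacher signal are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes and hyperparameters, chosen for illustration only.
n_res, n_out = 200, 1          # reservoir and output dimensions
washout, T = 100, 1000         # discarded transient, training length
ridge, noise = 1e-4, 1e-3      # read-out and reservoir regularization

# Random reservoir scaled to spectral radius < 1, plus feedback weights.
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_fb = rng.uniform(-1.0, 1.0, (n_res, n_out))

# Teacher signal: a sine wave the network should later generate on its own.
teacher = np.sin(2 * np.pi * np.arange(T) / 50).reshape(-1, n_out)

# Collect states under teacher forcing (feedback = true target); the state
# noise acts as a simple reservoir regularizer.
x = np.zeros(n_res)
states = np.zeros((T, n_res))
for t in range(T):
    fb = teacher[t - 1] if t > 0 else np.zeros(n_out)
    x = np.tanh(W @ x + W_fb @ fb + noise * rng.standard_normal(n_res))
    states[t] = x

# Ridge-regression read-out: W_out = (X^T X + ridge * I)^{-1} X^T Y.
X, Y = states[washout:], teacher[washout:]
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y).T

# Autonomous run: the trained output is fed back in place of the teacher.
y = teacher[-1].copy()
outputs = []
for _ in range(200):
    x = np.tanh(W @ x + W_fb @ y)
    y = W_out @ x
    outputs.append(float(y[0]))
print(f"autonomous output range: [{min(outputs):.3f}, {max(outputs):.3f}]")
```

In this sketch, increasing `ridge` shrinks the read-out weights and so damps how strongly read-out errors are re-injected through the feedback loop; with `ridge` set to zero, the autonomous run often diverges, which is the error amplification the abstract refers to.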