A tighter bound for the echo state property
IEEE Transactions on Neural Networks
An echo state network (ESN) consists of a large, randomly connected neural network, the reservoir, which is driven by an input signal and projects to output units. During training, only the connections from the reservoir to these output units are learned. A key requisite for output-only training is the echo state property (ESP), which requires that the effect of initial conditions vanish as time passes. In this paper, we use analytical examples to show that a widely used criterion for the ESP, namely that the spectral radius of the reservoir weight matrix be smaller than unity, is not sufficient to guarantee the echo state property. We obtain these examples by investigating local bifurcation properties of standard ESNs. Moreover, we provide new sufficient conditions for the echo state property of standard sigmoid and leaky-integrator ESNs. We furthermore suggest an improved technical definition of the echo state property, and discuss what practitioners should (and should not) observe when they optimize their reservoirs for specific tasks.
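The fading-memory behavior that defines the ESP can be illustrated with a minimal simulation: drive one reservoir from two different initial states with the same input sequence and check that the states converge. The sketch below is an illustration only, not the paper's analysis; all parameter values (reservoir size, scaling factor, input length) are arbitrary choices. It scales the weight matrix so its largest singular value is below one, a classical sufficient condition for the ESP with tanh units that is stronger than the spectral-radius criterion the abstract shows to be insufficient.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100  # reservoir size (arbitrary for this illustration)

# Random reservoir weights, scaled so the largest singular value is 0.9.
# This makes the tanh update a contraction, which suffices for the ESP;
# scaling the spectral radius below 1 alone would not guarantee it.
W = rng.standard_normal((N, N))
W *= 0.9 / np.linalg.norm(W, 2)  # ord=2 gives the largest singular value
w_in = rng.standard_normal(N)    # input weights

def run(x0, inputs):
    """Iterate the standard ESN state update from initial state x0."""
    x = x0.copy()
    for u in inputs:
        x = np.tanh(W @ x + w_in * u)
    return x

inputs = rng.standard_normal(500)          # one shared input sequence
x_a = run(rng.standard_normal(N), inputs)  # two different initial states
x_b = run(rng.standard_normal(N), inputs)

# The distance between the two trajectories shrinks geometrically,
# so after 500 steps the initial conditions are effectively forgotten.
print(np.linalg.norm(x_a - x_b))
```

With the contraction factor 0.9, the state difference decays at least as fast as 0.9^t, so the printed distance is numerically zero; replacing the singular-value scaling with spectral-radius scaling can break this guarantee, which is the gap the paper addresses.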