Reservoir sizes and feedback weights interact non-linearly in echo state networks
ICANN'12 Proceedings of the 22nd international conference on Artificial Neural Networks and Machine Learning - Volume Part I
This paper investigates the interaction between the driving output feedback and the internal reservoir dynamics in echo state networks (ESNs). The interplay is studied experimentally on the multiple superimposed oscillators (MSO) benchmark. The experimental data reveal a dual effect of the output feedback strength on the network dynamics: it drives the dynamic reservoir, but it can also block suitable reservoir dynamics. Moreover, the data show that the reservoir size crucially co-determines the likelihood of generating an effective ESN. We show that, depending on the complexity of the MSO dynamics, somewhat smaller networks can yield better performance. Optimizing the output feedback weight range and the network size is thus crucial for generating an effective ESN. With proper parameter choices, it is possible to generate ESNs that approximate MSOs with errors several orders of magnitude smaller than those previously reported. We conclude that ESNs hold considerably more potential than previously thought, and we sketch out some promising future research directions.
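The setup the abstract describes can be illustrated with a minimal sketch: an ESN with no external input, driven purely by output feedback, teacher-forced on an MSO-style signal and then run in free-running generation mode. All concrete values below (reservoir size, spectral radius, feedback weight range, the two sine frequencies 0.2 and 0.311, the ridge regularizer) are illustrative assumptions, not the parameters used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# MSO-style target: sum of two incommensurate sines (frequencies assumed;
# 0.2 and 0.311 are commonly used for the two-oscillator variant)
T_train, T_test = 600, 300
t = np.arange(T_train + T_test)
y = np.sin(0.2 * t) + np.sin(0.311 * t)

# Deliberately small reservoir, in the spirit of the paper's finding that
# smaller networks can suffice for low-complexity MSO dynamics
N = 20
W = rng.uniform(-1.0, 1.0, (N, N))
W *= 0.8 / np.max(np.abs(np.linalg.eigvals(W)))   # scale spectral radius to 0.8
w_fb = rng.uniform(-0.6, 0.6, N)                  # output-feedback weights (range assumed)

# Teacher forcing: drive the reservoir with the true target signal
X = np.zeros((T_train, N))
x = np.zeros(N)
for k in range(1, T_train):
    x = np.tanh(W @ x + w_fb * y[k - 1])
    X[k] = x

# Ridge-regression readout, discarding an initial washout period
washout = 100
A, b = X[washout:], y[washout:T_train]
W_out = np.linalg.solve(A.T @ A + 1e-8 * np.eye(N), A.T @ b)

# Free-running generation: feed the network's own output back in
preds = np.empty(T_test)
y_prev = y[T_train - 1]
for k in range(T_test):
    x = np.tanh(W @ x + w_fb * y_prev)
    preds[k] = x @ W_out
    y_prev = preds[k]

mse = np.mean((preds - y[T_train:]) ** 2)
print(f"free-run MSE over {T_test} steps: {mse:.3e}")
```

The dual role of the feedback described in the abstract shows up directly in `w_fb`: scaling its range changes how strongly the target signal entrains the reservoir during teacher forcing, and over- or under-scaling it degrades the free-running prediction.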