Reservoir Computing (RC) offers a computationally efficient and well-performing technique for exploiting the temporal processing power of Recurrent Neural Networks (RNNs), while avoiding the traditionally long training times and stability problems. The method is both simple and elegant: a random RNN (called the reservoir) is constructed using only a few global parameters that tune its dynamics into a desirable regime, and the dynamic response of the reservoir is used to train a simple linear regression function, called the readout function; the reservoir itself remains untrained. This technique has produced experimentally very convincing results on a variety of tasks, but a thorough understanding of how the dynamics determine performance is still lacking. This contribution aims to extend that understanding by presenting a more sophisticated way of characterizing the reservoir dynamics: the dynamic profile of the Jacobian of the reservoir along its trajectory, instead of static, a priori measures such as the standard spectral radius. We show that this measure gives a more accurate description of the reservoir dynamics and can serve as a predictor of performance. Additionally, owing to its grounding in dynamical systems theory, this measure offers insight into the underlying mechanisms of RC.
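The pipeline described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: it assumes a standard tanh echo state network, a toy delay-recall task, a ridge-regression readout, and it contrasts the static spectral-radius tuning with a trajectory-dependent measure based on the reservoir Jacobian (for a tanh reservoir, J_t = diag(1 - x_t^2) W). All sizes and hyperparameters here are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, washout = 100, 1000, 100

# Random reservoir, rescaled so its spectral radius is 0.9
# (the static, a priori tuning the abstract refers to).
W = rng.standard_normal((N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, size=N)

# Toy task (an assumption for this sketch): reproduce the input delayed by 5 steps.
u = rng.uniform(-0.8, 0.8, size=T)
y_target = np.roll(u, 5)

# Drive the reservoir; the reservoir itself is never trained.
X = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])
    X[t] = x

# Readout: ridge regression on the collected states (initial washout discarded).
A, b = X[washout:], y_target[washout:]
W_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(N), A.T @ b)
y_pred = X @ W_out
err = np.sqrt(np.mean((y_pred[washout:] - b) ** 2))

# Dynamic profile: the Jacobian of the tanh map along the trajectory is
# J_t = diag(1 - x_t^2) @ W; its largest singular value gauges local expansion,
# and varies with the state, unlike the fixed spectral radius.
local_gain = [np.linalg.svd(np.diag(1 - X[t] ** 2) @ W, compute_uv=False)[0]
              for t in range(washout, T, 100)]

print(f"readout RMSE: {err:.3f}, mean local gain: {np.mean(local_gain):.2f}")
```

Note that `local_gain` is computed per time step from the visited states, which is what makes it a dynamic characterization: a strongly driven reservoir saturates its tanh units, shrinking `1 - x**2` and hence the local gain, something the input-independent spectral radius cannot capture.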