Reservoir Computing (RC) has gained prominence in the Neural Computation community since the early 2000s. An RC model contains at least two well-differentiated structures. One is a recurrent part, called the reservoir, which expands the input data and its history into a high-dimensional space; this projection is performed to enhance the linear separability of the input data. The other is a memoryless readout designed to be robust and fast to train. RC models are an alternative to Turing Machines and Recurrent Neural Networks for modeling cognitive processing in the nervous system, and they are also attractive Machine Learning tools for time series modeling and forecasting. Recently, a new RC model was introduced under the name of Echo State Queueing Networks (ESQN), in which the reservoir is a dynamical system arising from Queueing Theory. The initialization of the reservoir parameters can influence model performance, and some unsupervised techniques have recently been used to improve the performance of one specific RC method. In this paper, we apply these techniques to set the reservoir parameters of the ESQN model. In particular, we study ESQN initialization using Self-Organizing Maps, and we also test the model's performance when the reservoir is initialized with Hebbian rules. We present an empirical comparison of these reservoir initializations on a range of time series benchmarks.
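To make the two-structure architecture concrete, the following is a minimal sketch of a generic Echo State Network, the standard RC model, not the ESQN variant studied in the paper. The dimensions, spectral radius, and ridge-regression readout are illustrative assumptions; the point is the division of labor between a fixed random recurrent reservoir and a fast, memoryless linear readout.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not taken from the paper)
n_in, n_res = 1, 100

# Reservoir: fixed random recurrent weights, rescaled so the spectral
# radius is below 1 (a common sufficient condition for the echo state
# property in standard ESNs)
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))

def run_reservoir(inputs):
    """Expand an input sequence into high-dimensional reservoir states."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W @ x + W_in @ u)  # recurrent state update
        states.append(x)
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave
t = np.arange(300)
series = np.sin(0.1 * t)
U = series[:-1].reshape(-1, 1)  # inputs
y = series[1:]                  # targets

X = run_reservoir(U)

# Memoryless readout trained by ridge regression: a single linear
# solve, which is what makes RC training robust and fast
reg = 1e-6
W_out = np.linalg.solve(X.T @ X + reg * np.eye(n_res), X.T @ y)

pred = X @ W_out
mse = np.mean((pred - y) ** 2)
```

Only `W_out` is learned; `W` and `W_in` stay fixed after initialization, which is precisely why the choice of initialization scheme (random, SOM-based, or Hebbian) can matter for performance.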