Initializing reservoirs with exhibitory and inhibitory signals using unsupervised learning techniques

  • Authors:
  • Sebastián Basterrech; Václav Snášel

  • Affiliations:
  • VŠB-Technical University of Ostrava, Ostrava, Czech Republic (both authors)

  • Venue:
  • Proceedings of the Fourth Symposium on Information and Communication Technology
  • Year:
  • 2013


Abstract

Reservoir Computing (RC) has been gaining prominence in the Neural Computation community since the 2000s. An RC model has at least two well-differentiated structures. One is a recurrent part, called the reservoir, which projects the input data and historical information into a high-dimensional space; this projection is carried out to enhance the linear separability of the input data. The other is a memory-less structure designed for robust and fast learning. RC models are an alternative to Turing Machines and Recurrent Neural Networks for modeling cognitive processing in the nervous system. Additionally, they are useful Machine Learning tools for Time Series Modeling and Forecasting. Recently, a new RC model was introduced under the name Echo State Queueing Networks (ESQN). In this model, the reservoir is a dynamical system arising from Queueing Theory. The initialization of the reservoir parameters can influence model performance, and unsupervised techniques have recently been used to improve the performance of one specific RC method. In this paper, we apply these techniques to set the reservoir parameters of the ESQN model. In particular, we study ESQN initialization using Self-Organizing Maps, and we additionally test the model performance when the reservoir is initialized with Hebbian rules. We present an empirical comparison of these reservoir initializations on a range of time series benchmarks.
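
As a concrete illustration of the kind of unsupervised reservoir initialization the abstract refers to, the sketch below sets the input weights of a standard echo-state-style reservoir with a simple one-dimensional Kohonen (SOM) pass before fitting a memory-less linear readout. This is only an assumption-laden sketch, not the authors' method: the actual ESQN reservoir is a queueing-theoretic dynamical system rather than the tanh reservoir used here, and all sizes, learning rates, and the som_init helper are illustrative choices.

```python
# Minimal sketch (not the ESQN implementation from the paper): an echo-state-style
# reservoir whose input weights are initialized by an unsupervised SOM-like pass.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 100          # input and reservoir dimensions (assumed)

def som_init(inputs, n_units, epochs=20, lr0=0.5, sigma0=None):
    """Pull each unit's prototype toward the training inputs with a shrinking
    neighbourhood (Kohonen rule); the prototypes become the input weights."""
    sigma0 = sigma0 or n_units / 2.0
    W = rng.uniform(-0.5, 0.5, size=(n_units, inputs.shape[1]))
    idx = np.arange(n_units)
    for t in range(epochs):
        lr = lr0 * np.exp(-t / epochs)
        sigma = sigma0 * np.exp(-t / epochs)
        for x in inputs:
            bmu = np.argmin(np.linalg.norm(W - x, axis=1))      # best matching unit
            h = np.exp(-((idx - bmu) ** 2) / (2 * sigma ** 2))  # neighbourhood weights
            W += lr * h[:, None] * (x - W)
    return W

# Toy input series and reservoir with fixed recurrent weights.
u = np.sin(0.2 * np.arange(1000))[:, None]
W_in = som_init(u[:200], n_res)                       # unsupervised initialization
W_res = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
W_res *= 0.9 / max(abs(np.linalg.eigvals(W_res)))     # scale spectral radius below 1

x = np.zeros(n_res)
states = []
for t in range(len(u) - 1):
    x = np.tanh(W_in @ u[t] + W_res @ x)              # reservoir state update
    states.append(x.copy())
X = np.array(states)

# Memory-less readout: ridge regression on reservoir states (one-step-ahead target).
y = u[1:, 0]
beta = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)
print("train MSE:", np.mean((X @ beta - y) ** 2))
```

Replacing som_init with a plain random draw for W_in gives the baseline against which such unsupervised initializations are typically compared.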