Simple deterministically constructed cycle reservoirs with regular jumps
Neural Computation
Reservoir computing (RC) refers to a new class of state-space models with a fixed state transition structure (the reservoir) and an adaptable readout from the state space. The reservoir is supposed to be sufficiently complex to capture a large number of features of the input stream that can be exploited by the reservoir-to-output readout mapping. The field of RC has been growing rapidly, with many successful applications. However, RC has been criticized for not being principled enough: reservoir construction is largely driven by a series of randomized model-building stages, with both researchers and practitioners having to rely on trial and error. To initiate a systematic study of the field, we concentrate on one of the most popular classes of RC methods, namely echo state networks (ESNs), and ask: What is the minimal complexity of reservoir construction for obtaining competitive models, and what is the memory capacity (MC) of such simplified reservoirs? On a number of widely used time series benchmarks of different origin and characteristics, as well as through theoretical analysis, we show that a simple deterministically constructed cycle reservoir is comparable to the standard echo state network methodology. The (short-term) memory capacity of linear cyclic reservoirs can be made arbitrarily close to the proven optimal value.
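The simple cycle reservoir described above can be sketched in a few lines: the recurrent weight matrix is a single directed cycle with one uniform weight, all input weights share one magnitude, and only the linear readout is trained. The sketch below illustrates this on a toy delayed-recall task (a short-term memory probe); the parameter values, the RNG-based input sign pattern (the paper derives signs deterministically), and the ridge-regression readout are illustrative assumptions, not the authors' exact experimental setup.

```python
import numpy as np

def simple_cycle_reservoir(n_res, r=0.9, v=0.1, seed=0):
    """Build weight matrices of a simple cycle reservoir (SCR).

    The reservoir is a single directed cycle with uniform weight r;
    all input weights share magnitude v. Here the +/- sign pattern
    comes from a fixed RNG seed -- a simplification; the original
    construction derives it deterministically.
    """
    W = np.zeros((n_res, n_res))
    for i in range(n_res):
        W[(i + 1) % n_res, i] = r               # unit cycle: i -> i+1
    signs = np.where(np.random.default_rng(seed).random(n_res) < 0.5, -1.0, 1.0)
    w_in = v * signs                            # aperiodic +/- v input weights
    return W, w_in

def run_reservoir(W, w_in, u):
    """Drive the reservoir with a scalar input sequence; collect states."""
    x = np.zeros(W.shape[0])
    states = np.empty((len(u), W.shape[0]))
    for t, ut in enumerate(u):
        x = np.tanh(W @ x + w_in * ut)
        states[t] = x
    return states

# Toy short-term memory task: reconstruct the input delayed by k steps.
rng = np.random.default_rng(1)
u = rng.uniform(-1.0, 1.0, 2000)
k = 3
W, w_in = simple_cycle_reservoir(n_res=50)
X = run_reservoir(W, w_in, u)

washout = 100                                   # discard initial transient
idx = np.arange(washout, len(u))
Xt, yt = X[idx], u[idx - k]                     # target: u(t - k)

# Linear readout trained by ridge regression (the only trained part).
lam = 1e-6
w_out = np.linalg.solve(Xt.T @ Xt + lam * np.eye(Xt.shape[1]), Xt.T @ yt)
corr = np.corrcoef(Xt @ w_out, yt)[0, 1]
```

With the small input scaling the reservoir operates near its linear regime, so the readout recovers the delayed input almost perfectly; summing such squared correlations over all delays gives the memory capacity MC discussed in the abstract.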