Real-time computation at the edge of chaos in recurrent neural networks. Neural Computation.
Analysis and design of echo state networks. Neural Computation.
Training recurrent networks by Evolino. Neural Computation.
Improving reservoirs using intrinsic plasticity. Neurocomputing.
Predictive modeling with echo state networks. In Proceedings of the 18th International Conference on Artificial Neural Networks (ICANN 2008), Part I.
Stable output feedback in reservoir computing using ridge regression. In Proceedings of the 18th International Conference on Artificial Neural Networks (ICANN 2008), Part I.
Pruning and regularization in reservoir computing. Neurocomputing.
The introduction of time-scales in reservoir computing, applied to isolated digits recognition. In Proceedings of the 17th International Conference on Artificial Neural Networks (ICANN 2007).
Short term memory and pattern matching with simple echo state networks. In Proceedings of the 15th International Conference on Artificial Neural Networks: Biological Inspirations (ICANN 2005), Part I.
Survey: Reservoir computing approaches to recurrent neural network training. Computer Science Review.
New results on recurrent network training: Unifying the algorithms and accelerating convergence. IEEE Transactions on Neural Networks.
Collective behavior of a small-world recurrent neural system with scale-free distribution. IEEE Transactions on Neural Networks.
Minimum complexity echo state network. IEEE Transactions on Neural Networks.
Model-based kernel for efficient time series analysis. In Proceedings of the 19th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining.
A new class of state-space models, reservoir models, with a fixed state transition structure (the "reservoir") and an adaptable readout from the state space, has recently emerged as an approach to time series processing and modeling. The echo state network (ESN) is one of the simplest, yet most powerful, reservoir models. ESN models are generally constructed in a randomized manner. In our previous study (Rodan & Tiňo, 2011), we showed that a very simple, cyclic, deterministically generated reservoir can yield performance competitive with the standard ESN. In this contribution, we extend our previous study in three aspects. First, we introduce a novel simple deterministic reservoir model, cycle reservoir with jumps (CRJ), with highly constrained weight values, that achieves superior performance to the standard ESN on a variety of temporal tasks of different origins and characteristics. Second, we elaborate on the possible link between reservoir characterizations, such as the eigenvalue distribution of the reservoir matrix or the pseudo-Lyapunov exponent of the input-driven reservoir dynamics, and model performance. It has been suggested that a uniform coverage of the unit disk by such eigenvalues can lead to superior model performance. We show that despite its highly constrained eigenvalue distribution, CRJ consistently outperforms ESN (which has a much more uniform eigenvalue coverage of the unit disk). Also, unlike in the case of ESN, the pseudo-Lyapunov exponents of the selected optimal CRJ models are consistently negative. Third, we present a new framework for determining the short-term memory capacity of linear reservoir models to a high degree of precision. Using the framework, we study the effect of shortcut connections in the CRJ reservoir topology on its memory capacity.
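The CRJ reservoir mentioned in the abstract can be illustrated with a short sketch. The usual CRJ description is a unidirectional cycle in which every cycle connection shares one weight and regularly spaced bidirectional "jump" shortcuts share another, so the whole matrix is determined by three numbers. The function name and the exact jump placement below are illustrative assumptions, not taken from the abstract itself:

```python
import numpy as np

def crj_reservoir(n, jump_size, r_c, r_j):
    """Sketch of a cycle-reservoir-with-jumps (CRJ) weight matrix.

    All cycle connections share the single weight r_c and all jump
    (shortcut) connections share the single weight r_j, so the
    reservoir is fully deterministic; jump placement here is one
    plausible layout, assumed for illustration.
    """
    W = np.zeros((n, n))
    # Unidirectional cycle: unit i feeds unit (i + 1) mod n.
    for i in range(n):
        W[(i + 1) % n, i] = r_c
    # Bidirectional jumps every `jump_size` units around the cycle.
    for i in range(0, n - n % jump_size, jump_size):
        j = (i + jump_size) % n
        W[j, i] = r_j
        W[i, j] = r_j
    return W

# Example: 10-unit CRJ, jumps of length 3, cycle weight 0.5, jump weight 0.2.
W = crj_reservoir(10, 3, 0.5, 0.2)
```

The highly constrained structure is visible directly: the nonzero entries of `W` take only the two values `r_c` and `r_j`, which is what makes the eigenvalue distribution of the CRJ matrix so much more restricted than that of a randomly generated ESN reservoir.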
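The short-term memory capacity referred to in the abstract is commonly defined (following Jaeger) as MC = Σ_k r²(u(t−k), y_k(t)), where y_k is the optimal linear readout trained to reconstruct the input delayed by k steps. The paper's framework is analytical; purely as context, a simple empirical estimate for a linear reservoir can be sketched as follows (function name, defaults, and the Monte Carlo setup are mine, not the paper's method):

```python
import numpy as np

def memory_capacity(W, w_in, delays=20, T=2000, washout=200, seed=0):
    """Empirical short-term memory capacity of a linear reservoir.

    Drives x(t) = W x(t-1) + w_in * u(t) with i.i.d. uniform input,
    fits one linear readout per delay k by least squares, and sums
    the squared correlations between u(t-k) and its reconstruction.
    """
    rng = np.random.default_rng(seed)
    u = rng.uniform(-1.0, 1.0, T)
    x = np.zeros(W.shape[0])
    X = np.zeros((T, W.shape[0]))
    for t in range(T):
        x = W @ x + w_in * u[t]      # linear reservoir update
        X[t] = x
    mc = 0.0
    for k in range(1, delays + 1):
        states = X[washout:]          # discard transient
        target = u[washout - k:T - k]  # input delayed by k steps
        w, *_ = np.linalg.lstsq(states, target, rcond=None)
        r = np.corrcoef(states @ w, target)[0, 1]
        mc += r ** 2
    return mc
```

For a plain linear cycle reservoir with distinct input weights, the estimate is close to the reservoir size, which is the kind of baseline against which the effect of the CRJ jump connections on memory capacity can be examined.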