Echo State Networks (ESNs) constitute an emerging approach for efficiently modeling Recurrent Neural Networks (RNNs). In this paper we investigate some of the main aspects that can account for the success and the limitations of this class of models. In particular, we propose complementary classes of factors related to the contractivity and the architecture of reservoirs, and we study their relative relevance. First, we show the existence of a class of tasks for which ESN performance is independent of the architectural design. The effect of the Markovian factor, which characterizes a significant class within these cases, is shown by introducing instances of tasks that are easy or hard for ESNs depending on the contractivity of the reservoir dynamics. In the complementary cases, for which architectural design is effective, we investigate and decompose the aspects of network design that allow a larger reservoir to progressively improve predictive performance. In particular, we identify four key architectural factors: input variability, multiple time-scale dynamics, non-linear interactions among units, and regression in an augmented feature space. To quantify the effects of these architectural factors within the class of tasks successfully approached by ESNs, variants of the basic ESN model are proposed and tested on datasets of different nature and difficulty. The experimental evidence confirms the role of the Markovian factor and shows that all of the identified architectural factors play a major role in determining ESN performance.
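The basic ESN setup referred to above (a fixed random recurrent reservoir with contractive dynamics, plus a linear readout that is the only trained component) can be sketched as follows. This is an illustrative minimal implementation, not the paper's actual model: the hyperparameter values, the toy sine-prediction task, and names such as `run_reservoir` are our own assumptions.

```python
import numpy as np

# Minimal Echo State Network sketch (illustrative assumptions throughout).
# The reservoir is a fixed random recurrent layer; only the linear readout
# is trained, here by ridge regression.
rng = np.random.default_rng(0)

n_in, n_res = 1, 100
spectral_radius = 0.9  # scaling below 1 encourages contractive (echo state) dynamics

W_in = rng.uniform(-0.1, 0.1, (n_res, n_in))
W = rng.uniform(-1.0, 1.0, (n_res, n_res))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))  # rescale to target spectral radius

def run_reservoir(inputs):
    """Drive the untrained reservoir with an input sequence, collecting states."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task (our assumption): one-step-ahead prediction of a sine wave.
t = np.arange(500)
u = np.sin(0.2 * t)
X = run_reservoir(u[:-1])   # reservoir states, shape (499, n_res)
y = u[1:]                   # next-step targets

# Ridge-regression readout; the regularizer avoids ill-conditioning.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)

pred = X @ W_out
mse = np.mean((pred - y) ** 2)
print(mse)
```

The spectral-radius rescaling is the standard way to obtain the contractive reservoir dynamics that the Markovian analysis in the paper relies on; larger reservoirs (`n_res`) correspond to the augmented feature spaces whose benefits the architectural factors decompose.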