Improving recurrent neural network performance using transfer entropy
ICONIP'10 Proceedings of the 17th international conference on Neural information processing: models and applications - Volume Part II
Architectural and Markovian factors of echo state networks
Neural Networks
We are interested in optimizing the recurrent connection structure of Echo State Networks (ESNs), because their topology can strongly influence performance. We study ESN predictive capacity through numerical simulations on Mackey-Glass time series, and find that a particular small subset of ESNs performs much better than ordinary ESNs, provided that the topology of the recurrent feedback connections satisfies certain conditions. We argue that this small subset separates two large sets of ESNs, and that the separation can be characterized in terms of a phase transition. Given the criticality of this phase transition, we introduce the notion of Critical Echo State Networks (CESNs). We discuss why CESNs perform better than other ESNs.
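To make the experimental setting concrete, the following is a minimal sketch (not the authors' implementation) of the standard ESN pipeline the abstract refers to: a Mackey-Glass series generated by Euler integration of the delay differential equation, a randomly connected reservoir rescaled to a chosen spectral radius, and a ridge-regression readout trained for one-step-ahead prediction. All parameter values (reservoir size, spectral radius, delay tau=17, regularization) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def mackey_glass(n_steps, tau=17, dt=1.0):
    # Euler discretization of dx/dt = 0.2*x(t-tau)/(1+x(t-tau)^10) - 0.1*x(t)
    history = int(tau / dt)
    x = np.full(n_steps + history, 1.2)  # constant initial history (assumption)
    for t in range(history, n_steps + history - 1):
        x_tau = x[t - history]
        x[t + 1] = x[t] + dt * (0.2 * x_tau / (1 + x_tau**10) - 0.1 * x[t])
    return x[history:]

class ESN:
    def __init__(self, n_res=200, spectral_radius=0.9, seed=0):
        rng = np.random.default_rng(seed)
        W = rng.normal(size=(n_res, n_res))
        # Rescale recurrent weights so the largest eigenvalue magnitude
        # equals the chosen spectral radius (standard echo-state heuristic).
        W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
        self.W = W
        self.W_in = rng.uniform(-0.5, 0.5, size=n_res)
        self.n_res = n_res

    def run(self, u):
        # Drive the reservoir with the scalar input sequence u,
        # collecting the state vector at every time step.
        states = np.zeros((len(u), self.n_res))
        x = np.zeros(self.n_res)
        for t, u_t in enumerate(u):
            x = np.tanh(self.W @ x + self.W_in * u_t)
            states[t] = x
        return states

series = mackey_glass(2000)
esn = ESN()
X = esn.run(series[:-1])   # reservoir states
y = series[1:]             # one-step-ahead targets

n_train, washout = 1500, 100
A, b = X[washout:n_train], y[washout:n_train]
# Ridge-regression readout: W_out = (A^T A + lambda I)^{-1} A^T b
W_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(esn.n_res), A.T @ b)

pred = X[n_train:] @ W_out
nrmse = np.sqrt(np.mean((pred - y[n_train:]) ** 2)) / np.std(y[n_train:])
print(f"test NRMSE: {nrmse:.4f}")
```

Varying the reservoir topology (e.g. sparsity or connection structure of `W`) while holding the readout fixed is the kind of experiment the abstract's topology comparison implies.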