Computation at the edge of chaos: phase transitions and emergent computation. CNLS '89: Proceedings of the Ninth Annual International Conference of the Center for Nonlinear Studies on Self-Organizing, Collective, and Cooperative Phenomena in Natural and Artificial Computing Networks (Emergent Computation).
Real-time computation at the edge of chaos in recurrent neural networks. Neural Computation.
Movement generation with circuits of spiking neurons. Neural Computation.
Isolated word recognition with the liquid state machine: a case study. Information Processing Letters (special issue on applications of spiking neural networks).
Synergies between intrinsic and synaptic plasticity mechanisms. Neural Computation.
Improving reservoirs using intrinsic plasticity. Neurocomputing.
Error-backpropagation in networks of fractionally predictive spiking neurons. ICANN'11: Proceedings of the 21st International Conference on Artificial Neural Networks, Part I.
Simple deterministically constructed cycle reservoirs with regular jumps. Neural Computation.
Re-visiting the echo state property. Neural Networks.
Randomly connected networks have short temporal memory. Neural Computation.
Reservoir computing (RC) systems are powerful models for online computations on input sequences. They consist of a memoryless readout neuron that is trained on top of a randomly connected recurrent neural network. RC systems are commonly used in two flavors: with analog or with binary (spiking) neurons in the recurrent circuit. Previous work indicated a fundamental difference in the behavior of these two implementations of the RC idea: the performance of an RC system built from binary neurons seems to depend strongly on the network connectivity structure, whereas no such clear dependency has been observed in networks of analog neurons. In this letter, we address this apparent dichotomy by investigating the influence of the network connectivity (parameterized by the neuron in-degree) on a family of network models that interpolates between analog and binary networks. Our analyses are based on a novel estimation of the Lyapunov exponent of the network dynamics with the help of branching process theory, rank measures that estimate the kernel quality and generalization capabilities of recurrent networks, and a novel mean field predictor for computational performance. These analyses reveal that the phase transition between ordered and chaotic network behavior of binary circuits qualitatively differs from the one in analog circuits, leading to differences in the integration of information over short and long timescales. This explains the decreased computational performance observed in binary circuits that are densely connected. The mean field predictor is also used to bound the memory function of recurrent circuits of binary neurons.
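The setup the abstract describes — a sparse recurrent reservoir with fixed in-degree, and a gain parameter that interpolates between analog and binary units — can be sketched numerically. The snippet below is an illustrative sketch, not the paper's method: the branching-process estimation of the Lyapunov exponent is replaced here by the standard two-trajectory (Benettin-style) numerical estimate, and all names (`step`, `lyapunov_estimate`) and parameter values (`N`, `K`, `beta`) are assumptions of this example.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200     # reservoir size (illustrative)
K = 10      # in-degree: each neuron receives K recurrent connections
beta = 4.0  # gain; tanh(beta * x) approaches a binary step as beta grows

# Sparse recurrent weight matrix with fixed in-degree K per neuron
W = np.zeros((N, N))
for i in range(N):
    idx = rng.choice(N, size=K, replace=False)
    W[i, idx] = rng.normal(0.0, 1.0 / np.sqrt(K), size=K)

w_in = rng.normal(0.0, 1.0, size=N)  # input weights

def step(x, u):
    # One reservoir update; beta interpolates between analog (small beta)
    # and effectively binary (large beta) units.
    return np.tanh(beta * (W @ x + w_in * u))

def lyapunov_estimate(T=500, eps=1e-8):
    # Numerical Lyapunov-exponent estimate: drive two trajectories that
    # start eps apart with the same input, measure the average log growth
    # of their separation, and renormalize the perturbation each step.
    x = 0.1 * rng.normal(size=N)
    x2 = x + eps * rng.normal(size=N) / np.sqrt(N)
    log_growth = 0.0
    for _ in range(T):
        u = rng.normal()
        x, x2 = step(x, u), step(x2, u)
        d = np.linalg.norm(x2 - x)
        log_growth += np.log(d / eps)
        x2 = x + (eps / d) * (x2 - x)  # rescale perturbation back to eps
    return log_growth / T

print(lyapunov_estimate())
```

A positive estimate indicates chaotic dynamics, a negative one ordered dynamics; sweeping `K` and `beta` in such a sketch is one way to probe the order-chaos transition that the letter analyzes analytically.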