Neurodynamical models of working memory (WM) should provide mechanisms for storing, maintaining, retrieving, and deleting information, yet many models address only a subset of these aspects. Here we present a comparatively simple WM model in which all of these performance modes are trained into a recurrent neural network (RNN) of the echo state network (ESN) type. The model is demonstrated on a bracket-level parsing task with a stream of rich and noisy graphical script input. In terms of nonlinear dynamics, memory states correspond, intuitively, to attractors in an input-driven system. As a supplementary contribution, the article proposes a rigorous formal framework for describing such attractors, generalizing the standard definition of attractors in autonomous (input-free) dynamical systems.
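To make the ESN setting concrete, the following is a minimal sketch of a generic echo state network, not the paper's WM model: a fixed random reservoir is driven by input, the spectral radius is scaled below 1 (a common heuristic related to the echo state property), and only a linear readout is trained, here by ridge regression on a toy delayed-recall task. All dimensions and constants are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not from the paper): reservoir units, train/test
# lengths, and the recall delay for the toy memory task
N, T_train, T_test, delay = 200, 1000, 200, 3

# Fixed random input weights and reservoir; rescale the reservoir so its
# spectral radius is 0.9, a common heuristic aimed at the echo state property
W_in = rng.uniform(-0.5, 0.5, (N, 1))
W = rng.uniform(-0.5, 0.5, (N, N))
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()

def run(u):
    """Drive the reservoir with the scalar input sequence u, collect all states."""
    x = np.zeros(N)
    X = np.empty((len(u), N))
    for t, ut in enumerate(u):
        x = np.tanh(W @ x + W_in[:, 0] * ut)
        X[t] = x
    return X

# Toy memory task: the target is the input from `delay` steps earlier
u = rng.uniform(-1, 1, T_train + T_test)
y = np.roll(u, delay)

X = run(u)
washout = 50  # discard initial transient states before fitting
Xtr, ytr = X[washout:T_train], y[washout:T_train]

# Ridge-regression readout: the only trained part of an ESN
reg = 1e-6
W_out = np.linalg.solve(Xtr.T @ Xtr + reg * np.eye(N), Xtr.T @ ytr)

pred = X[T_train:] @ W_out
nrmse = np.sqrt(np.mean((pred - y[T_train:]) ** 2)) / np.std(y[T_train:])
print(f"test NRMSE: {nrmse:.3f}")
```

The sketch shows why ESN training is cheap: the recurrent weights stay fixed, so learning reduces to one linear solve. The paper's model goes further by training WM control modes (store, maintain, retrieve, delete) into such a network.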