This paper presents a set of techniques for generating a class of testbeds that can be used to assess the ability of recurrent neural networks to integrate information over time. In particular, the testbeds allow one to evaluate the capability of such models, and possibly of other architectures and algorithms, to (a) categorize different time series, (b) anticipate future signal levels on the basis of past ones, and (c) function robustly with respect to noise and other systematic random variations of the temporal and spatial properties of the input time series. The paper also presents a number of analysis tools that can be used to understand the functioning and organization of the dynamical internal representations that recurrent neural networks develop in order to acquire these capabilities, which involve handling signal properties such as periodicity, repetitions, spikes, and the levels and rates of change of the input signals. The utility of the proposed testbeds is illustrated by testing and studying the capacity of Elman neural networks to predict and categorize different signals in two exemplary tasks.
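To make the prediction setting concrete, the following is a minimal sketch of an Elman network trained to anticipate the next level of a periodic input signal, in the spirit of the prediction testbeds described above. All specifics here (network sizes, learning rate, the toy sine-wave signal, and the truncated one-step gradient) are illustrative assumptions, not the paper's actual experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes, not the paper's configuration.
n_in, n_hid, n_out = 1, 12, 1
W_xh = rng.normal(0, 0.3, (n_hid, n_in))   # input -> hidden
W_hh = rng.normal(0, 0.3, (n_hid, n_hid))  # context (previous hidden) -> hidden
W_hy = rng.normal(0, 0.3, (n_out, n_hid))  # hidden -> output
b_h = np.zeros(n_hid)
b_y = np.zeros(n_out)

signal = np.sin(np.linspace(0, 8 * np.pi, 400))  # toy periodic time series
lr = 0.05

for epoch in range(200):
    h = np.zeros(n_hid)  # context units start at rest
    for t in range(len(signal) - 1):
        x = signal[t:t + 1]
        target = signal[t + 1:t + 2]
        h_new = np.tanh(W_xh @ x + W_hh @ h + b_h)
        y = W_hy @ h_new + b_y
        err = y - target
        # Truncated (one-step) gradient: the previous hidden state is
        # treated as a fixed extra input, as in Elman's original scheme.
        dh = (W_hy.T @ err) * (1 - h_new ** 2)
        W_hy -= lr * np.outer(err, h_new)
        b_y -= lr * err
        W_xh -= lr * np.outer(dh, x)
        W_hh -= lr * np.outer(dh, h)
        b_h -= lr * dh
        h = h_new

# After training, one-step predictions should track the signal closely.
h = np.zeros(n_hid)
preds = []
for t in range(len(signal) - 1):
    h = np.tanh(W_xh @ signal[t:t + 1] + W_hh @ h + b_h)
    preds.append((W_hy @ h + b_y)[0])
mse = np.mean((np.array(preds) - signal[1:]) ** 2)
print(f"one-step prediction MSE: {mse:.4f}")
```

The categorization variant of the testbeds could be sketched analogously by reading the hidden (context) layer out into class labels instead of next-step signal levels.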