In this paper, we elaborate on the claim that clustering in the recurrent layer of recurrent neural networks (RNNs) reflects meaningful information-processing states even prior to training. By concentrating on activation clusters in RNNs, without discarding the continuous state-space dynamics of the network, we extract predictive models that we call neural prediction machines (NPMs). When RNNs with sigmoid activation functions are initialized with small weights (a common practice in the RNN community), the clusters of recurrent activations emerging prior to training are indeed meaningful and correspond to Markov prediction contexts. In this case, the extracted NPMs correspond to a class of Markov models called variable memory length Markov models (VLMMs). To appreciate how much information has really been induced during training, RNN performance should always be compared with that of VLMMs and of NPMs extracted before training, serving as "null" base models. Our arguments are supported by experiments on a chaotic symbolic sequence and a context-free language with a deep recursive structure.
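The pipeline described above — drive an untrained small-weight sigmoid RNN with a symbolic sequence, cluster the recurrent activations, and attach next-symbol statistics to each cluster — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: all sizes, the random input sequence, the k-means variant, and the Laplace smoothing are assumptions made here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy sizes (assumptions, not taken from the paper).
alphabet = 4        # number of input symbols
hidden = 10         # recurrent layer size
n_clusters = 8      # number of activation clusters
seq = rng.integers(0, alphabet, size=2000)  # random symbolic sequence

# Untrained sigmoid RNN initialized with small weights; with such
# initialization the recurrent activations reflect the recent input
# history (the Markovian architectural bias discussed in the text).
W_in = rng.normal(scale=0.1, size=(hidden, alphabet))
W_rec = rng.normal(scale=0.1, size=(hidden, hidden))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Collect the recurrent-layer state after each input symbol.
states = np.zeros((len(seq), hidden))
h = np.zeros(hidden)
for t, s in enumerate(seq):
    x = np.zeros(alphabet)
    x[s] = 1.0                       # one-hot encode the symbol
    h = sigmoid(W_in @ x + W_rec @ h)
    states[t] = h

# Cluster the activations with a few iterations of plain k-means.
centers = states[rng.choice(len(seq), n_clusters, replace=False)]
for _ in range(10):
    labels = np.argmin(((states[:, None] - centers) ** 2).sum(-1), axis=1)
    for k in range(n_clusters):
        if np.any(labels == k):
            centers[k] = states[labels == k].mean(axis=0)

# NPM: each cluster acts as a prediction context; estimate its
# next-symbol distribution from counts (Laplace-smoothed).
counts = np.ones((n_clusters, alphabet))
for t in range(len(seq) - 1):
    counts[labels[t], seq[t + 1]] += 1
npm = counts / counts.sum(axis=1, keepdims=True)
```

On a random sequence the extracted distributions are close to uniform; on a structured sequence (such as the chaotic symbolic sequence used in the experiments) the per-cluster distributions diverge, which is exactly why NPMs extracted before training make a meaningful "null" baseline.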