It has been shown that if a recurrent neural network (RNN) learns to process a regular language, a finite-state machine (FSM) can be extracted from it by treating regions of phase space as FSM states. It has also been shown that an RNN can be constructed to implement a Turing machine by using its dynamics as counters. But how does a network learn languages that require counting? Rodriguez, Wiles, and Elman (1999) showed that a simple recurrent network (SRN) can learn to process a simple context-free language (CFL) by counting up and down. This article extends that work to a range of language tasks in which an SRN develops solutions that not only count but also copy and store counting information. In one case, the network stores information much as an explicit storage mechanism would; in other cases, it stores information more indirectly, in trajectories that are sensitive to slight, context-dependent displacements. In this sense, an SRN can learn analog computation as a set of interdependent counters, which suggests that SRNs may serve as an alternative psychological model of language and sequence processing.
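To make the counting dynamics concrete, the following is a minimal sketch, not the authors' implementation, of an Elman-style SRN trained by backpropagation through time on next-symbol prediction for the a^n b^n language used in Rodriguez, Wiles, and Elman (1999). The end-of-string marker '#', the two hidden units, and all hyperparameters are illustrative assumptions.

```python
# A minimal sketch (assumed setup, not the paper's code): an Elman SRN
# trained by backpropagation through time on next-symbol prediction for
# a^n b^n. Hidden-unit count, '#' end marker, learning rate, and training
# regime are all illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
SYMBOLS = ["a", "b", "#"]          # '#' marks end of string (an assumption)
V, H = len(SYMBOLS), 2             # vocabulary size, hidden units

def one_hot(i):
    v = np.zeros(V); v[i] = 1.0
    return v

def make_sequence(n):
    """Symbol indices for a^n b^n followed by the end marker."""
    return [0] * n + [1] * n + [2]

# SRN parameters: h_t = tanh(Wxh x_t + Whh h_{t-1} + bh), y_t = softmax(Why h_t + by)
Wxh = rng.normal(0, 0.5, (H, V)); Whh = rng.normal(0, 0.5, (H, H)); bh = np.zeros(H)
Why = rng.normal(0, 0.5, (V, H)); by = np.zeros(V)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

lr = 0.05
for epoch in range(3000):
    seq = make_sequence(rng.integers(1, 8))   # train on n in 1..7
    xs = [one_hot(s) for s in seq]
    # Forward pass: predict the next symbol at every step.
    hs, ps = [np.zeros(H)], []
    for x in xs[:-1]:
        hs.append(np.tanh(Wxh @ x + Whh @ hs[-1] + bh))
        ps.append(softmax(Why @ hs[-1] + by))
    # Backpropagation through time with cross-entropy loss.
    dWxh = np.zeros_like(Wxh); dWhh = np.zeros_like(Whh); dbh = np.zeros_like(bh)
    dWhy = np.zeros_like(Why); dby = np.zeros_like(by); dh_next = np.zeros(H)
    for t in reversed(range(len(ps))):
        dy = ps[t] - one_hot(seq[t + 1])      # gradient of softmax + cross-entropy
        dWhy += np.outer(dy, hs[t + 1]); dby += dy
        dh = Why.T @ dy + dh_next
        dz = (1 - hs[t + 1] ** 2) * dh        # back through tanh
        dWxh += np.outer(dz, xs[t]); dWhh += np.outer(dz, hs[t]); dbh += dz
        dh_next = Whh.T @ dz
    for P, G in [(Wxh, dWxh), (Whh, dWhh), (bh, dbh), (Why, dWhy), (by, dby)]:
        P -= lr * np.clip(G, -5, 5)

# Inspect the hidden-state trajectory on a longer held-out string. A
# counting solution of the kind the article describes moves roughly
# monotonically along one direction of phase space during the a's and
# retraces it during the b's, making the end of the string predictable.
h = np.zeros(H)
for s in make_sequence(10):
    h = np.tanh(Wxh @ one_hot(s) + Whh @ h + bh)
    print(SYMBOLS[s], np.round(h, 2))
```

Printing the trajectory, rather than only the predictions, is the point of the sketch: the counting is visible in the hidden state itself, which is what licenses treating the trained SRN as a set of analog counters rather than as a finite-state machine.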