Distributed Representations, Simple Recurrent Networks, and Grammatical Structure. Machine Learning (special issue: Connectionist Approaches to Language Learning).
Language as a Dynamical System. In Mind as Motion.
On the Effect of Analog Noise in Discrete-Time Analog Computations. Neural Computation.
Neural Networks and Analog Computation: Beyond the Turing Limit.
Introduction to Automata Theory, Languages, and Computation.
Learning the Dynamics of Embedded Clauses. Applied Intelligence.
LSTM Recurrent Networks Learn Simple Context-Free and Context-Sensitive Languages. IEEE Transactions on Neural Networks.
We give a necessary condition for a simple recurrent neural network with two sigmoidal hidden units to implement a recognizer of the formal language {aⁿbⁿ | n > 0}, which is generated by the production rules {S → aSb, S → ab}, and show that by setting the parameters so as to conform to the condition we obtain a recognizer of the language. The condition explains the instability of the learning process reported in previous studies. It also implies, in contrast to this success in implementing the recognizer, the difficulty of obtaining recognizers of more complicated languages.
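As a concrete illustration of the language and grammar discussed in the abstract (this sketch is our own and is not taken from the paper; the function names `generate` and `recognize` are hypothetical), the grammar {S → aSb, S → ab} derives exactly the strings aⁿbⁿ for n > 0, and membership can be tested directly:

```python
def generate(n):
    """Derive a string from the grammar {S -> aSb, S -> ab}.

    Applying S -> aSb (n - 1) times and then S -> ab yields a^n b^n.
    """
    if n < 1:
        raise ValueError("the grammar only derives strings with n > 0")
    return "a" * n + "b" * n

def recognize(s):
    """Direct membership test for {a^n b^n | n > 0}."""
    n = len(s) // 2
    return len(s) > 0 and len(s) % 2 == 0 and s == "a" * n + "b" * n
```

A neural recognizer, by contrast, must decide membership incrementally from the input sequence alone, which is why the paper's analysis of how two sigmoidal hidden units can count the a's against the b's is non-trivial.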