We describe a computational framework for language learning and parsing in which dynamical systems navigate on fractal sets. We explore the framework's predictions in an artificial grammar task in which humans and recurrent neural networks are trained on a language with recursive structure. The results support the dynamical systems models' claim that grammatical systems metamorphose continuously during learning. This perspective also permits structural comparison between the recursive representations in symbolic and neural network models.
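To make the idea of parsing on a fractal set concrete, here is a minimal sketch, assuming only the general notion of a dynamical recognizer (in the spirit of Moore's "Dynamical recognizers"): each input symbol applies an affine map to a state in the unit interval, contractions playing the role of pushing onto a stack and expansions of popping, so the reachable parse states form a self-similar set. The target language a^n b^n, the particular maps, and the name `recognize` are illustrative choices, not the paper's construction.

```python
from fractions import Fraction

def recognize(s: str) -> bool:
    """Accept exactly the strings a^n b^n, n >= 1 (illustrative language)."""
    z = Fraction(0)          # parse state on the unit interval
    seen_b = False
    for ch in s:
        if ch == "a":
            if seen_b:       # an 'a' after a 'b' rules out a^n b^n
                return False
            z = z / 2 + Fraction(1, 2)    # contraction ~ push
        elif ch == "b":
            seen_b = True
            z = 2 * (z - Fraction(1, 2))  # expansion ~ pop
            if z < 0:        # popped an empty stack: more b's than a's
                return False
        else:
            return False     # symbol outside the alphabet
    return seen_b and z == 0 # accept iff the stack unwound exactly

assert recognize("ab") and recognize("aaabbb")
assert not recognize("aabbb") and not recognize("abab") and not recognize("")
```

Exact rational arithmetic makes the final-state test `z == 0` reliable; a floating-point version would need a tolerance, which is one reason analog-computation accounts of such systems assume unbounded state precision.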