This paper examines the inductive inference of a complex grammar with neural networks: specifically, the task considered is that of training a network to classify natural language sentences as grammatical or ungrammatical, thereby exhibiting the same kind of discriminatory power provided by the Principles and Parameters linguistic framework, or Government-and-Binding theory. Neural networks are trained, without the division into learned versus innate components assumed by Chomsky, in an attempt to produce the same judgments as native speakers on sharply grammatical/ungrammatical data. How a recurrent neural network could possess linguistic capability, and the properties of several common recurrent neural network architectures, are discussed. The problem exhibits training behavior that is often not present with smaller grammars, and training was initially difficult. However, after implementing several techniques aimed at improving the convergence of the gradient-descent backpropagation-through-time training algorithm, significant learning was possible. Certain architectures were found to be better able to learn an appropriate grammar. The operation of the networks and their training is analyzed. Finally, the extraction of rules in the form of deterministic finite-state automata is investigated.