In order to be taken seriously, connectionist natural language processing systems must be able to parse syntactically complex sentences. Current connectionist parsers either ignore structure or place prior restrictions on the structural complexity of the sentences they can process, limiting either the number of phrases or the "depth" of the sentence structure. XERIC networks, presented here, are distributed-representation connectionist parsers that can analyze and represent syntactically varied sentences, including ones with recursive phrase-structure constructs; the architecture places no a priori limit on the depth or length of sentences. A XERIC network uses a recurrent network to read words one at a time. RAAM-style reduced descriptions and X-Bar grammar are combined to form an economical syntactic representation scheme, together with a training technique that allows XERIC to use multiple virtual copies of its RAAM decoder network to learn, by gradient descent, to parse and represent sentence structure. XERIC networks also perform number-person disambiguation and lexical disambiguation. Results show that the networks train to within a few percent error on sentences up to a phrase-nesting depth of ten or more, and that this performance generalizes well.
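The RAAM-style reduced descriptions mentioned above can be illustrated with a minimal sketch: an autoencoder that compresses two fixed-width child vectors into one parent vector of the same width and learns, by gradient descent on reconstruction error, to decode the children back out. This is a generic Pollack-style RAAM in NumPy, not the paper's actual XERIC architecture; the dimensions, learning rate, and toy data are illustrative assumptions.

```python
import numpy as np

# Minimal RAAM-style autoencoder sketch (assumption: illustrative only,
# not the XERIC architecture from the paper). Two child vectors are
# compressed into one parent vector of the same width, then decoded back.
rng = np.random.default_rng(0)
D = 8                                   # width of each child / parent vector

W_enc = rng.normal(0, 0.5, (D, 2 * D))  # encoder: concatenated children -> parent
W_dec = rng.normal(0, 0.5, (2 * D, D))  # decoder: parent -> concatenated children

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def encode(left, right):
    """Compress two children into one parent (a reduced description)."""
    return sigmoid(W_enc @ np.concatenate([left, right]))

def decode(parent):
    """Recover the two children from a parent vector."""
    out = sigmoid(W_dec @ parent)
    return out[:D], out[D:]

# Toy training set: random binary "terminal" vectors paired into subtrees.
pairs = [(rng.integers(0, 2, D).astype(float),
          rng.integers(0, 2, D).astype(float)) for _ in range(20)]

def recon_error():
    err = 0.0
    for l, r in pairs:
        dl, dr = decode(encode(l, r))
        err += np.sum((dl - l) ** 2) + np.sum((dr - r) ** 2)
    return err / len(pairs)

lr = 0.5
before = recon_error()
for epoch in range(2000):
    for l, r in pairs:
        x = np.concatenate([l, r])       # autoencoder target = input
        h = sigmoid(W_enc @ x)           # parent (reduced description)
        y = sigmoid(W_dec @ h)           # reconstruction of both children
        # Backpropagate squared reconstruction error through both layers.
        dy = (y - x) * y * (1 - y)
        dh = (W_dec.T @ dy) * h * (1 - h)
        W_dec -= lr * np.outer(dy, h)
        W_enc -= lr * np.outer(dh, x)
after = recon_error()
```

Because the parent vector has the same width as each child, an encoded parent can itself serve as a child in a later encoding step, which is what lets reduced descriptions represent recursively nested phrase structure without a fixed depth limit.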