Holistic parsers offer a viable alternative to traditional algorithmic parsers. They generalize well and are inherently robust. In a holistic parser, parsing is achieved by directly mapping the connectionist representation of the input sentence to the connectionist representation of the target parse tree, so little prior knowledge of the underlying parsing mechanism needs to be assumed. However, this also makes holistic parsing difficult to understand. In this article, an analysis of the operations of the confluent preorder parser (CPP) is presented. In the analysis, the CPP is viewed as a dynamical system, and holistic parsing is perceived as a sequence of state transitions through its state space. The seemingly one-shot parsing mechanism can thus be elucidated as a step-by-step inference process, with the intermediate parsing decisions reflected in the states visited during parsing. The study serves two purposes. First, it improves our understanding of how the CPP corrects grammatical errors. An error in a sentence causes the CPP to deviate from the trajectory it follows when parsing the original sentence, but as the remaining terminals are read, the two trajectories gradually converge until the correct parse tree is finally produced. Second, it reveals that systematic parse tree representations alone cannot guarantee good generalization performance in holistic parsing: more importantly, the representations need to be distributed in certain useful locations of the representational space. Sentences with similar trailing terminals should have their corresponding parse tree representations mapped to nearby locations in that space. The study provides concrete evidence that encoding linearized parse trees obtained via preorder traversal can satisfy this requirement.
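To make the encoding step concrete, the following is a minimal sketch of what "linearizing a parse tree via preorder traversal" means. The `Node` class, the toy grammar symbols, and the example sentence are illustrative assumptions for this sketch, not taken from the article, which encodes such linearized sequences with connectionist representations.

```python
class Node:
    """A parse tree node with a label and ordered children (illustrative)."""
    def __init__(self, label, children=()):
        self.label = label
        self.children = list(children)

def preorder(node):
    """Return the preorder (root-first, left-to-right) linearization of a tree."""
    seq = [node.label]
    for child in node.children:
        seq.extend(preorder(child))
    return seq

# Toy parse tree for the sentence "the cat sleeps":
#        S
#       / \
#      NP  VP
#     / \   |
#   Det  N  V
tree = Node("S", [
    Node("NP", [Node("Det", [Node("the")]),
                Node("N", [Node("cat")])]),
    Node("VP", [Node("V", [Node("sleeps")])]),
])

print(preorder(tree))
# ['S', 'NP', 'Det', 'the', 'N', 'cat', 'VP', 'V', 'sleeps']
```

Because preorder visits each subtree's root before its leaves, trees whose sentences share trailing terminals end their linearizations similarly, which is consistent with the article's observation that such sentences should map to nearby parse tree representations.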