Compilers: principles, techniques, and tools
Recursive distributed representations
Artificial Intelligence - On connectionist symbol processing
Computer processing of natural language
Learning internal representations by error propagation
Parallel distributed processing: explorations in the microstructure of cognition, vol. 1
The neural network pushdown automaton: model, stack and learning simulations
Natural language understanding (2nd ed.)
Statistical Language Learning
A Theory of Syntactic Recognition for Natural Language
Learning PP attachment from corpus statistics
Connectionist, Statistical, and Symbolic Approaches to Learning for Natural Language Processing
Learning long-term dependencies in NARX recurrent neural networks
IEEE Transactions on Neural Networks
Structural bias in inducing representations for probabilistic natural language parsing
ICANN/ICONIP'03 Proceedings of the 2003 joint international conference on Artificial neural networks and neural information processing
Connectionist holistic parsing offers a viable and attractive alternative to traditional algorithmic parsers. Given only a limited set of grammatical sentences and their corresponding parse trees, a holistic parser can inductively learn the grammatical regularities underlying the training examples that govern the parsing process. Various connectionist parsers have been proposed in the past; each approach had its own characteristics, yet some techniques were shared in common. In this article, we explore the dimensions underlying the design of a holistic parser, including the methods used to encode sentences and parse trees, whether a sentence and its corresponding parse tree share the same representation, the use of confluent inference, and the inclusion of phrases in the training set. Different combinations of these design factors give rise to different holistic parsers. We scrutinize these design techniques and compare the performance of several parsers on language parsing, including the confluent preorder parser, the backpropagation parsing network, the XERIC parser of Berg (1992), the modular connectionist parser of Sharkey and Sharkey (1992), Reilly's (1992) model, and their derivatives. Experiments are performed to evaluate their generalization capability and robustness. The results reveal a number of issues essential for building an effective holistic parser.
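To make the encoding dimension concrete, the sketch below illustrates (under stated assumptions) the core idea behind holistic representations of parse trees, in the spirit of Pollack's recursive distributed representations (RAAM): an entire binary tree is compressed into one fixed-width vector by recursively encoding pairs of child vectors, and a decoder recovers approximate child vectors from a parent. This is a minimal, untrained illustration with random weights and a hypothetical three-word lexicon, not a reproduction of any of the parsers compared in the article; a real holistic parser would train the encoder and decoder jointly by backpropagation.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 16  # width of every distributed representation (assumed)

# Toy symbol vectors for terminals (hypothetical vocabulary).
lexicon = {w: rng.standard_normal(DIM) for w in ["the", "cat", "sat"]}

# Encoder/decoder weights of a RAAM-style autoencoder (untrained here;
# in practice learned so that decode_step(encode(t)) reconstructs t).
W_enc = rng.standard_normal((DIM, 2 * DIM)) * 0.1
W_dec = rng.standard_normal((2 * DIM, DIM)) * 0.1

def encode(tree):
    """Compress a binary parse tree into one fixed-width vector."""
    if isinstance(tree, str):          # terminal: look up its symbol vector
        return lexicon[tree]
    left, right = tree                 # internal node: encode both children,
    children = np.concatenate([encode(left), encode(right)])
    return np.tanh(W_enc @ children)   # then squash into DIM dimensions

def decode_step(vec):
    """Recover (approximate) left/right child vectors from a parent."""
    out = W_dec @ vec
    return out[:DIM], out[DIM:]

# The parse ((the cat) sat) becomes a single DIM-dimensional vector,
# regardless of the tree's depth or number of leaves.
root = encode((("the", "cat"), "sat"))
```

The fixed width of `root` is the point: it lets a sentence representation and a parse-tree representation live in the same vector space, which is what makes design choices like shared representations and confluent inference possible.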