Incremental Syntactic Parsing of Natural Language Corpora with Simple Synchrony Networks
IEEE Transactions on Knowledge and Data Engineering
We present a connectionist architecture and demonstrate that it can learn syntactic parsing from a corpus of parsed text. The architecture can represent syntactic constituents and can learn generalizations over them, thereby addressing the sparse-data problems of previous connectionist architectures. We apply these Simple Synchrony Networks to mapping sequences of word tags to parse trees. After training on parsed samples of the Brown Corpus, the networks achieve constituent precision and recall approaching those of statistical methods for this task.
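The abstract evaluates parser output by precision and recall over constituents. As an illustration only (not code from the paper), the sketch below computes labeled constituent precision, recall, and F1, assuming each parse is represented as a collection of (label, start, end) spans over the word sequence; the function name and data layout are assumptions made for this example.

```python
# Illustrative sketch: labeled constituent precision/recall/F1.
# A constituent is assumed to be a (label, start, end) span; this
# representation is an assumption for the example, not the paper's.
from collections import Counter


def constituent_prf(gold_trees, predicted_trees):
    """Compute precision, recall, and F1 over labeled constituents.

    Each tree is a collection of (label, start, end) spans. Multisets
    (Counters) are used so repeated identical spans are matched fairly.
    """
    matched = gold_total = predicted_total = 0
    for gold, predicted in zip(gold_trees, predicted_trees):
        gold_spans = Counter(gold)
        pred_spans = Counter(predicted)
        matched += sum((gold_spans & pred_spans).values())
        gold_total += sum(gold_spans.values())
        predicted_total += sum(pred_spans.values())
    precision = matched / predicted_total if predicted_total else 0.0
    recall = matched / gold_total if gold_total else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1


if __name__ == "__main__":
    # Toy example: one sentence, gold vs. predicted bracketing.
    gold = [[("NP", 0, 2), ("VP", 2, 5), ("S", 0, 5)]]
    pred = [[("NP", 0, 2), ("VP", 3, 5), ("S", 0, 5)]]
    print(constituent_prf(gold, pred))  # -> (0.666..., 0.666..., 0.666...)
```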