Subsymbolic systems have been used successfully to model several aspects of human language processing. Subsymbolic parsers are appealing because they allow the interpretation to be revised as words are processed incrementally. Yet they have proven very hard to scale up to realistic language because of long training times, limited memory, and the difficulty of representing linguistic structure. In this study, we show that it is possible to track long-distance dependencies and to parse into deeper structures than before using two techniques: a localist encoding of the input sequence and a dynamic unrolling of the network according to the parse tree. With these techniques, the system nonmonotonically parses a corpus of realistic sentences into parse trees labelled with grammatical tags from a broad-coverage Head-driven Phrase Structure Grammar (HPSG) of English.
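The two techniques can be illustrated with a toy sketch. This is not the paper's actual architecture: the vocabulary, dimensions, and composition function below are assumptions chosen only to show what "localist encoding" (one active unit per word) and "dynamic unrolling" (the same network weights replicated to match the shape of each parse tree) mean in practice.

```python
import numpy as np

rng = np.random.default_rng(0)

VOCAB = ["the", "dog", "chased", "a", "cat"]   # toy vocabulary (assumption)
DIM = len(VOCAB)                               # localist: one unit per word
HIDDEN = 8                                     # hidden width (assumption)

def one_hot(word):
    """Localist encoding: exactly one active unit per word."""
    v = np.zeros(DIM)
    v[VOCAB.index(word)] = 1.0
    return v

# Shared composition weights. Dynamic unrolling means these same weights are
# applied at every tree node, so the network's shape follows the parse tree.
W_left = rng.standard_normal((HIDDEN, HIDDEN)) * 0.1
W_right = rng.standard_normal((HIDDEN, HIDDEN)) * 0.1
W_leaf = rng.standard_normal((HIDDEN, DIM)) * 0.1

def encode(tree):
    """Recursively encode a binary parse tree given as nested tuples,
    e.g. ("NP", "the", "dog"). Leaves are word strings."""
    if isinstance(tree, str):                  # leaf: embed the one-hot word
        return np.tanh(W_leaf @ one_hot(tree))
    _label, left, right = tree                 # internal node: compose children
    return np.tanh(W_left @ encode(left) + W_right @ encode(right))

tree = ("S", ("NP", "the", "dog"), ("VP", "chased", ("NP", "a", "cat")))
h = encode(tree)
print(h.shape)  # one fixed-width vector for the whole sentence
```

Because the unrolled network mirrors the tree, sentences of any length and bracketing reduce to the same fixed-width representation, which is what lets a single set of weights generalize across structures.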