Hybrid thematic role processor: symbolic linguistic relations revised by connectionist learning
Proceedings of the 16th International Joint Conference on Artificial Intelligence (IJCAI'99), Volume 2
A modular, recurrent connectionist network is taught to incrementally parse complex sentences. From input presented one word at a time, the network learns to do semantic role assignment, noun phrase attachment, and clause structure recognition, for sentences with both active and passive constructions and center-embedded clauses. The network makes syntactic and semantic predictions at every step. Previous predictions are revised as expectations are confirmed or violated with the arrival of new information. The network induces its own grammar rules for dynamically transforming an input sequence of words into a syntactic/semantic interpretation. The network generalizes well and is tolerant of ill-formed inputs.
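The incremental behavior described above can be illustrated with a minimal sketch: an Elman-style recurrent network that consumes one word at a time, updates a hidden state, and emits a distribution over semantic roles at every step, so that earlier predictions can be revised as new words arrive. This is not the paper's actual architecture; the tiny vocabulary, role inventory, and untrained random weights are illustrative assumptions standing in for the grammar the network would induce through training.

```python
import math
import random

random.seed(0)

# Hypothetical toy vocabulary and role set; the paper's lexicon and
# role inventory are not reproduced here.
VOCAB = ["the", "dog", "chased", "cat", "was", "by"]
ROLES = ["agent", "action", "patient"]


class ElmanParser:
    """Elman-style recurrent net: each word updates a hidden state
    carried across the sentence, and a softmax over roles is emitted
    at every step, allowing predictions to shift with new input."""

    def __init__(self, hidden=8):
        self.h = [0.0] * hidden  # recurrent hidden state
        # Random weights stand in for trained parameters.
        self.W_in = [[random.uniform(-0.5, 0.5) for _ in VOCAB]
                     for _ in range(hidden)]
        self.W_rec = [[random.uniform(-0.5, 0.5) for _ in range(hidden)]
                      for _ in range(hidden)]
        self.W_out = [[random.uniform(-0.5, 0.5) for _ in range(hidden)]
                      for _ in ROLES]

    def step(self, word):
        """Consume one word; return the current role distribution."""
        x = [1.0 if w == word else 0.0 for w in VOCAB]  # one-hot input
        self.h = [
            math.tanh(sum(wi * xi for wi, xi in zip(row_in, x)) +
                      sum(wr * hj for wr, hj in zip(row_rec, self.h)))
            for row_in, row_rec in zip(self.W_in, self.W_rec)
        ]
        logits = [sum(w * hj for w, hj in zip(row, self.h))
                  for row in self.W_out]
        z = [math.exp(l) for l in logits]          # softmax over roles
        s = sum(z)
        return {role: p / s for role, p in zip(ROLES, z)}


# Usage: feed a passive sentence word by word; the prediction at each
# step reflects everything seen so far and can revise earlier guesses.
parser = ElmanParser()
for word in ["the", "cat", "was", "chased", "by", "the", "dog"]:
    probs = parser.step(word)
```

With trained weights, the arrival of "was" and "by" would shift probability mass so that "cat" is reinterpreted from agent to patient, mirroring the expectation-revision the abstract describes.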