The analysis of reading times can provide insight into the processes that underlie language comprehension, with longer reading times indicating greater cognitive load. There is evidence that the human language processor is highly predictive: prior context allows upcoming linguistic material to be anticipated. Previous work has investigated the contributions of semantic and syntactic context in isolation, essentially treating them as independent factors. In this paper we analyze reading times in terms of a single predictive measure that integrates a model of semantic composition with an incremental parser and a language model.
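The general idea of such an integrated measure can be illustrated with a minimal sketch: combine a syntactic predictability term (surprisal from a language model) with a semantic fit term (similarity between a composed context vector and the upcoming word's vector). All data below are hypothetical toy values, the bigram model stands in for the incremental parser, and the weighting scheme is an assumption for illustration, not the paper's actual model.

```python
import math

# Toy bigram language model built from hand-made counts (hypothetical data).
bigram_counts = {
    ("the", "pilot"): 3, ("the", "plane"): 5,
    ("pilot", "landed"): 4, ("pilot", "ate"): 1,
}
unigram_counts = {"the": 8, "pilot": 5, "plane": 5, "landed": 4, "ate": 1}

def bigram_surprisal(prev, word, alpha=1.0):
    """-log2 P(word | prev) with add-alpha smoothing over the toy vocabulary."""
    v = len(unigram_counts)
    num = bigram_counts.get((prev, word), 0) + alpha
    den = sum(c for (p, _), c in bigram_counts.items() if p == prev) + alpha * v
    return -math.log2(num / den)

# Toy distributional vectors (hypothetical); composition by vector addition.
vectors = {
    "the": [0.1, 0.1, 0.1],
    "pilot": [0.9, 0.2, 0.1],
    "landed": [0.8, 0.3, 0.2],
    "ate": [0.1, 0.9, 0.1],
}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

def compose(words):
    """Additive composition of the context words' vectors."""
    dims = range(len(next(iter(vectors.values()))))
    return [sum(vectors[w][d] for w in words if w in vectors) for d in dims]

def integrated_load(context, word, lam=0.5):
    """Weighted sum of syntactic surprisal and semantic mismatch (1 - cosine)."""
    syn = bigram_surprisal(context[-1], word)
    sem = 1.0 - cosine(compose(context), vectors[word])
    return lam * syn + (1 - lam) * sem

# "landed" is both more probable and a better semantic fit after
# "the pilot" than "ate", so its predicted processing load is lower.
print(integrated_load(["the", "pilot"], "landed"))
print(integrated_load(["the", "pilot"], "ate"))
```

The point of the sketch is that a single per-word quantity can carry both sources of expectation; in a reading-time study, `integrated_load` would be a predictor in a regression against observed times.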