A system making optimal use of available information in incremental language comprehension might be expected to use linguistic knowledge together with current input to revise beliefs about previous input. Under some circumstances, such an error-correction capability might induce comprehenders to adopt grammatical analyses that are inconsistent with the true input. Here we present a formal model of how such input-unfaithful garden paths may be adopted, and of the difficulty incurred when they are subsequently disconfirmed, combining a rational noisy-channel model of syntactic comprehension under uncertain input with the surprisal theory of incremental processing difficulty. We also present a behavioral experiment confirming the key empirical predictions of the theory.
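The core mechanism can be sketched in a few lines. This is only an illustrative toy, not the paper's actual model: where the paper marginalizes over parses of a probabilistic grammar, the sketch below assumes a hypothetical enumerated prior over complete sentences and a crude word-level substitution noise model. Word surprisal is computed by marginalizing over hypotheses about what the true input was, so a comprehender with a noisy channel (`noise > 0`) retains some belief in an input-unfaithful reading (here, "as" in place of the perceived "at") and is correspondingly less surprised by a continuation that only that reading licenses:

```python
import math

# Toy generative model over complete sentences (all probabilities are
# hypothetical, chosen only to illustrate the mechanism).
LM = {
    ("the", "coach", "smiled", "at", "the", "player", "and", "waved"): 0.55,
    ("the", "coach", "smiled", "as", "the", "player", "tossed", "a", "frisbee"): 0.40,
    ("the", "coach", "smiled", "at", "the", "player", "tossed", "a", "frisbee"): 0.05,
}

def likelihood(perceived, intended, noise):
    """Crude word-level noise model: each word is perceived veridically
    with probability 1 - noise, otherwise misperceived (probability noise)."""
    p = 1.0
    for per, inten in zip(perceived, intended):
        p *= (1.0 - noise) if per == inten else noise
    return p

def surprisal(perceived_prefix, next_word, noise):
    """-log2 P(next_word | noisy perceptual input), marginalizing over
    hypotheses about what the true input string actually was."""
    i = len(perceived_prefix)
    z = 0.0       # total posterior mass over intended prefixes
    p_next = 0.0  # mass on hypotheses whose next word is `next_word`
    for sent, prior in LM.items():
        joint = prior * likelihood(perceived_prefix, sent[:i], noise)
        z += joint
        if len(sent) > i and sent[i] == next_word:
            p_next += joint
    return -math.log2(p_next / z)

prefix = ("the", "coach", "smiled", "at", "the", "player")
# A noisy-channel comprehender keeps some belief in the input-unfaithful
# "as"-reading, so "tossed" is less surprising than for a comprehender
# who treats the perceived input as fully faithful:
print(surprisal(prefix, "tossed", noise=0.1))  # uncertain-input comprehender
print(surprisal(prefix, "tossed", noise=0.0))  # faithful-input comprehender
```

The difference between the two printed values is the qualitative signature of the model: belief in an input-unfaithful analysis lowers surprisal at the point where a faithful garden-path analysis would be disconfirmed.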