Language comprehension, like all other extraction of meaningful structure from perceptual input, takes place under noisy conditions. If human language comprehension is a rational process in the sense of making use of all available information sources, then we might expect uncertainty in word-level input to affect sentence-level comprehension. However, nearly all contemporary models of sentence comprehension assume clean input---that is, that the input to the sentence-level comprehension mechanism is a perfectly formed, completely certain sequence of input tokens (words). This article presents a simple model of rational human sentence comprehension under noisy input, and uses the model to investigate some outstanding problems in the psycholinguistic literature for theories of rational human sentence comprehension. We argue that by explicitly accounting for input-level noise in sentence processing, our model provides solutions for these outstanding problems and broadens the scope of theories of human sentence comprehension as rational probabilistic inference.
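The core idea of comprehension as rational probabilistic inference under noisy input can be illustrated with a toy noisy-channel sketch: the comprehender infers the intended word from a noisy percept by Bayes' rule, P(word | percept) ∝ P(percept | word) · P(word). Everything below — the tiny lexicon, the prior, and the edit-distance noise model — is an illustrative assumption for exposition, not the model proposed in the article.

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance via the standard one-row DP."""
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1,        # deletion
                                     dp[j - 1] + 1,    # insertion
                                     prev + (ca != cb))  # substitution
    return dp[-1]

# Hypothetical prior over intended words (stands in for a language model).
PRIOR = {"cat": 0.5, "cap": 0.3, "bat": 0.2}

def likelihood(percept: str, word: str, noise: float = 0.1) -> float:
    """Toy noise model: likelihood decays geometrically with edit distance."""
    return noise ** edit_distance(percept, word)

def posterior(percept: str) -> dict:
    """Bayes' rule over the lexicon: P(w | percept) ∝ P(percept | w) P(w)."""
    scores = {w: likelihood(percept, w) * p for w, p in PRIOR.items()}
    z = sum(scores.values())
    return {w: s / z for w, s in scores.items()}
```

Note how the posterior blends the two information sources: a clean percept of a frequent word dominates, but as input noise grows, the prior (expectations from the language model) increasingly drives the inferred interpretation — the behavior the abstract attributes to rational comprehension under uncertain input.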