We present a constancy rate principle governing language generation. We show that this principle implies that local measures of entropy (ignoring context) should increase with the sentence number. We demonstrate that this is indeed the case by measuring entropy in three different ways. We also show that this effect has both lexical (which words are used) and non-lexical (how the words are used) causes.
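The claim that local, context-ignoring entropy should grow with sentence position is straightforward to probe empirically. Below is a minimal sketch, not the authors' code: it trains an add-one-smoothed unigram model on half of the Brown news documents and reports mean per-word surprisal by sentence position on the other half. The corpus split, the unigram choice, and names such as `per_word_surprisal` are illustrative assumptions; the paper itself measures entropy in three different ways.

```python
# Minimal sketch (illustrative, not the authors' method): estimate a local,
# context-free entropy proxy per sentence and check whether it rises with
# sentence number. Assumes NLTK and its Brown corpus are installed
# (pip install nltk; then nltk.download('brown')).

import math
from collections import Counter

from nltk.corpus import brown

fileids = brown.fileids(categories="news")
train_ids = fileids[: len(fileids) // 2]
test_ids = fileids[len(fileids) // 2 :]

# Unigram model with add-one smoothing, trained on held-out documents.
counts = Counter(w.lower() for fid in train_ids for w in brown.words(fid))
total = sum(counts.values())
vocab = len(counts) + 1  # reserve one count's worth of mass for unseen words

def per_word_surprisal(sentence):
    """Mean negative log2 probability per word, ignoring all context."""
    logps = [
        math.log2((counts[w.lower()] + 1) / (total + vocab)) for w in sentence
    ]
    return -sum(logps) / len(logps)

# Average the per-sentence estimate at each sentence position across documents.
by_position = {}
for fid in test_ids:
    for i, sent in enumerate(brown.sents(fid)):
        by_position.setdefault(i, []).append(per_word_surprisal(sent))

for i in range(10):
    vals = by_position.get(i, [])
    if vals:
        print(f"sentence {i + 1}: {sum(vals) / len(vals):.3f} bits/word")
```

If the constancy rate principle holds, the printed values should trend upward: later sentences lean more heavily on accumulated context, which a context-free model cannot exploit, so their apparent per-word entropy looks higher.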