A Note on the Expressive Power of Probabilistic Context Free Grammars
Journal of Logic, Language and Information
Over the last decade, probabilistic parsing has become standard in the parsing literature; one purpose of the probabilities is to discard unlikely parses. We investigate the effect that discarding low-probability parses has on both the weak and strong generative power of context-free grammars. We prove that probabilistic context-free grammars are more powerful than their non-probabilistic counterparts, but in a way that is orthogonal to the Chomsky hierarchy. In particular, we show that the increase in power cannot be used to model any dependencies that non-probabilistic context-free grammars cannot.
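The core mechanism discussed above can be made concrete with a toy example. The following sketch (the grammar, rule probabilities, and threshold are illustrative assumptions, not taken from the paper) shows how a PCFG assigns each derivation the product of its rule probabilities, and how imposing a probability cutoff carves a different string set out of the language the underlying CFG generates:

```python
from itertools import product

# Toy PCFG over {a, b} (right-linear, so every string has exactly one
# leftmost derivation): S -> a S (0.4) | b S (0.1) | a (0.4) | b (0.1).
# The four rule probabilities for S sum to 1.0, as a proper PCFG requires.
RULES = {
    "aS": 0.4,  # S -> a S
    "bS": 0.1,  # S -> b S
    "a": 0.4,   # S -> a
    "b": 0.1,   # S -> b
}

def derivation_prob(string):
    """Probability of the unique derivation of `string`: the product of
    the probabilities of the rules used, one per character."""
    p = 1.0
    for i, ch in enumerate(string):
        last = i == len(string) - 1
        p *= RULES[ch if last else ch + "S"]
    return p

def language_above(threshold, max_len=3):
    """Strings (up to `max_len`) whose derivation probability survives
    the cutoff -- i.e. the parses a thresholding parser would keep."""
    out = []
    for n in range(1, max_len + 1):
        for tup in product("ab", repeat=n):
            s = "".join(tup)
            if derivation_prob(s) >= threshold:
                out.append(s)
    return out

# The underlying CFG generates all of {a,b}+, but after the cutoff only a
# finite set of high-probability strings remains.
print(language_above(0.05))  # → ['a', 'b', 'aa', 'aaa']
```

The surviving set is not a context-free language carved out by any grammar transformation on the original rules; it depends on the numeric threshold, which is one way to see why the extra power of thresholded PCFGs sits orthogonal to the Chomsky hierarchy.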