We study the problem of computing the probability that a given stochastic context-free grammar (SCFG), G, generates a string in a given regular language L(D) (given by a DFA, D). This basic problem has a number of applications in statistical natural language processing, and it is also a key step towards quantitative ω-regular model checking of stochastic context-free processes (equivalently, 1-exit recursive Markov chains, or stateless probabilistic pushdown processes). We show that the probability that G generates a string in L(D) can be computed to within arbitrary desired precision in polynomial time (in the standard Turing model of computation), under a rather mild assumption about the SCFG, G, and with no extra assumption about D. We show that this assumption is satisfied for SCFGs whose rule probabilities are learned via the well-known inside-outside (EM) algorithm for maximum-likelihood estimation (a standard method for constructing SCFGs in statistical NLP and biological sequence analysis). Thus, for these SCFGs the algorithm always runs in P-time.
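The quantity described above can be set up as the least fixed point of a system of polynomial equations over variables x[A,q,r], read as "the probability that nonterminal A derives a string that drives D from state q to state r." The following sketch illustrates this product construction on a toy SCFG and DFA (both invented here for illustration); it uses naive value iteration from the all-zero vector, whereas the paper's polynomial-time guarantee relies on a more careful numerical method:

```python
# Toy SCFG: S -> a S (0.5) | b (0.5); it generates a^n b with prob 0.5^(n+1).
rules = {'S': [(0.5, ['a', 'S']), (0.5, ['b'])]}
nonterminals = set(rules)

# Toy DFA over {a, b} accepting strings with an even number of a's:
# state 0 = even, state 1 = odd; start state 0, accept state 0.
states, start, accept = [0, 1], 0, {0}
delta = {(0, 'a'): 1, (1, 'a'): 0, (0, 'b'): 0, (1, 'b'): 1}

def rhs_weight(rhs, q, r, x):
    """Probability that this rule right-hand side derives a string taking
    the DFA from q to r, using the current approximation x for nonterminals."""
    if not rhs:
        return 1.0 if q == r else 0.0
    sym, rest = rhs[0], rhs[1:]
    if sym not in nonterminals:                # terminal: deterministic DFA step
        return rhs_weight(rest, delta[(q, sym)], r, x)
    return sum(x[(sym, q, mid)] * rhs_weight(rest, mid, r, x)
               for mid in states)              # nonterminal: sum over midpoints

# Value iteration from the all-zero vector converges monotonically
# to the least fixed point of the polynomial system.
x = {(A, q, r): 0.0 for A in nonterminals for q in states for r in states}
for _ in range(200):
    x = {(A, q, r): sum(p * rhs_weight(rhs, q, r, x) for p, rhs in rules[A])
         for A in nonterminals for q in states for r in states}

# Probability that G generates a string in L(D): start nonterminal,
# DFA start state, summed over accepting end states.
answer = sum(x[('S', start, r)] for r in accept)
print(round(answer, 4))   # → 0.6667, i.e. sum over even n of 0.5^(n+1) = 2/3
```

For this example the system reduces to x = 0.25·x + 0.5, whose least (and unique) solution is 2/3; in general the iteration converges but may do so slowly, which is why the polynomial-time result in the paper does not rest on plain value iteration.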