We show that, under certain conditions, one language model can be trained on the basis of another. The main instance of the technique trains a finite automaton from a probabilistic context-free grammar, such that the Kullback-Leibler distance between the grammar and the trained automaton is provably minimal. This substantially generalizes an existing algorithm for training an n-gram model from a probabilistic context-free grammar.
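A useful intuition for the n-gram instance: the KL-minimizing n-gram model is the one whose parameters match the expected relative n-gram frequencies under the grammar's string distribution. The sketch below illustrates this with a toy grammar; it approximates those expectations by Monte Carlo sampling, whereas the method the abstract refers to computes them exactly. The grammar, its rule probabilities, and all function names are assumptions made here for illustration only.

```python
import random
from collections import defaultdict

# Toy PCFG, assumed for illustration: each nonterminal maps to a list of
# (right-hand side, probability) pairs; symbols absent from the table are
# treated as terminals.
PCFG = {
    "S":  [(["NP", "VP"], 1.0)],
    "NP": [(["det", "n"], 0.7), (["n"], 0.3)],
    "VP": [(["v", "NP"], 0.6), (["v"], 0.4)],
}

def sample(symbol):
    """Sample a terminal string from the grammar, top-down."""
    if symbol not in PCFG:                       # terminal symbol
        return [symbol]
    rules = PCFG[symbol]
    rhs = random.choices([r for r, _ in rules],
                         weights=[p for _, p in rules])[0]
    return [t for sym in rhs for t in sample(sym)]

def train_bigram_model(n_samples=100_000):
    """Approximate the KL-minimizing bigram model for the PCFG.

    The optimal bigram parameters equal the expected relative bigram
    frequencies under the grammar; here those expectations are estimated
    by sampling rather than computed exactly.
    """
    counts = defaultdict(lambda: defaultdict(int))
    for _ in range(n_samples):
        words = ["<s>"] + sample("S") + ["</s>"]
        for w1, w2 in zip(words, words[1:]):
            counts[w1][w2] += 1
    # Normalize counts into conditional probabilities P(w2 | w1).
    return {w1: {w2: c / sum(successors.values())
                 for w2, c in successors.items()}
            for w1, successors in counts.items()}

if __name__ == "__main__":
    model = train_bigram_model()
    for w2, p in sorted(model["det"].items()):
        print(f"P({w2} | det) = {p:.3f}")
```

The toy grammar is non-recursive, so the estimates converge quickly. For a recursive grammar, sampling still applies whenever the grammar is consistent (its string probabilities sum to one), but it is the exact computation of expected frequencies that makes the minimality of the KL distance provable rather than merely approximate.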