In this paper we investigate whether a combination of topic-specific language models can outperform a general-purpose language model, using a trigram model as our baseline. We show that in the ideal case, in which it is known beforehand which model to use, the specific models perform considerably better than the baseline model. We test two methods of combining the specific models and show that both combinations outperform the general-purpose model, in particular when the data is diverse in terms of topics and vocabulary. Motivated by these findings, we propose a new model that combines a decision tree with a set of dynamic Bayesian networks. The new model uses context information to dynamically select an appropriate specific model.
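A minimal sketch of the combination idea described above: topic-specific models are merged by fixed-weight linear interpolation, and the "ideal case" corresponds to putting all weight on the correct topic model. For brevity the sketch uses unigram models instead of trigrams; all names, corpora, and weights are illustrative assumptions, not the paper's actual code or data.

```python
from collections import defaultdict

def train_unigram(corpus):
    """Estimate a unigram model (word -> probability) from a token list."""
    counts = defaultdict(int)
    for w in corpus:
        counts[w] += 1
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def mixture_prob(word, models, weights, floor=1e-8):
    """P(word) under a fixed-weight linear interpolation of topic models."""
    return sum(lam * m.get(word, floor) for m, lam in zip(models, weights))

# Two toy "topics" plus a general-purpose model trained on their union.
sports = "goal match team goal win".split()
finance = "stock market stock price win".split()
general_model = train_unigram(sports + finance)
topic_models = [train_unigram(sports), train_unigram(finance)]

# Ideal case: the correct topic (sports) is known beforehand, so its
# model gets all the mixture weight and beats the general model.
p_ideal = mixture_prob("goal", topic_models, [1.0, 0.0])  # 0.4
p_general = general_model["goal"]                         # 0.2
```

In the full model proposed above, the fixed weights would instead be chosen dynamically from context, e.g. by a decision tree selecting among the specific models.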