We present a new approach to language modeling based on dynamic Bayesian networks. The philosophy behind this architecture is to learn from data the appropriate dependency relations between the linguistic variables used in the language modeling process. It is an original and coherent framework that processes words and classes within the same model. This approach leads to new data-driven language models that can outperform classical ones, sometimes at lower computational complexity. We present experiments on small and medium corpora. The results show that this new technique is very promising and deserves further investigation.
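To make the idea of combining words and classes in one model concrete, here is a minimal sketch of a class-based bigram, in the spirit of Brown et al.'s class-based n-grams, where P(w | w_prev) is factored as P(class(w) | class(w_prev)) · P(w | class(w)). The corpus, class assignments, and lack of smoothing are toy assumptions for illustration, not the paper's actual Bayesian-network model, which learns the dependency structure from data rather than fixing it in advance.

```python
from collections import defaultdict

def train(corpus, word2class):
    """Count class-to-class transitions and word emissions per class.

    corpus: list of sentences (lists of words); word2class: toy class map.
    """
    class_bigrams = defaultdict(lambda: defaultdict(int))
    class_counts = defaultdict(int)
    word_given_class = defaultdict(lambda: defaultdict(int))
    for sent in corpus:
        classes = [word2class[w] for w in sent]
        for w, c in zip(sent, classes):
            word_given_class[c][w] += 1
            class_counts[c] += 1
        for c_prev, c in zip(classes, classes[1:]):
            class_bigrams[c_prev][c] += 1
    return class_bigrams, class_counts, word_given_class

def prob(w_prev, w, word2class, model):
    """P(w | w_prev) ~ P(c | c_prev) * P(w | c), unsmoothed."""
    class_bigrams, class_counts, word_given_class = model
    c_prev, c = word2class[w_prev], word2class[w]
    total_prev = sum(class_bigrams[c_prev].values())
    p_cc = class_bigrams[c_prev][c] / total_prev if total_prev else 0.0
    p_wc = word_given_class[c][w] / class_counts[c] if class_counts[c] else 0.0
    return p_cc * p_wc
```

Because probabilities are shared through classes, a word pair never seen in training (e.g. "the dog" when only "the cat" was observed) can still receive probability mass via its class transition, which is one motivation for mixing words and classes in a single model.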