Rethinking language models within the framework of dynamic Bayesian networks

  • Authors:
  • Murat Deviren;Khalid Daoudi;Kamel Smaïli

  • Affiliations:
  • Parole team, INRIA-LORIA, Villers les Nancy, France;Parole team, INRIA-LORIA, Villers les Nancy, France;Parole team, INRIA-LORIA, Villers les Nancy, France

  • Venue:
  • AI'05: Proceedings of the 18th Canadian Society Conference on Advances in Artificial Intelligence
  • Year:
  • 2005


Abstract

We present a new approach to language modeling based on dynamic Bayesian networks. The philosophy behind this architecture is to learn from data the appropriate dependency relations between the linguistic variables used in the language modeling process. It is an original and coherent framework that processes words and classes in the same model. This approach leads to new data-driven language models capable of outperforming classical ones, sometimes with lower computational complexity. We present experiments on small and medium-sized corpora. The results show that this new technique is very promising and deserves further investigation.
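
To make the core idea concrete, here is a minimal sketch (not the authors' implementation) of what "learning the appropriate dependency relations" between words and classes can look like: each candidate set of parents for the word variable W_t defines one dynamic Bayesian network structure, its conditional probability table is estimated from counts, and the structure that scores best on held-out data is selected. The toy corpus, class tags, and candidate structures below are purely illustrative assumptions.

```python
# Hedged sketch: select which variables (previous word, previous class,
# current class) should condition W_t by held-out log-likelihood.
from collections import defaultdict
import math

# Toy tagged corpus: (word, class) pairs per sentence -- hypothetical data.
train = [[("the", "DET"), ("cat", "N"), ("sleeps", "V")],
         [("the", "DET"), ("dog", "N"), ("barks", "V")]]
heldout = [[("the", "DET"), ("dog", "N"), ("sleeps", "V")]]

def context(sent, t, parents):
    """Build the conditioning context for W_t from the chosen parent set."""
    ctx = []
    if "prev_word" in parents:
        ctx.append(sent[t - 1][0] if t > 0 else "<s>")
    if "prev_class" in parents:
        ctx.append(sent[t - 1][1] if t > 0 else "<s>")
    if "class" in parents:
        ctx.append(sent[t][1])
    return tuple(ctx)

def estimate(corpus, parents):
    """Count-based (maximum-likelihood) table for P(W_t | parents)."""
    counts = defaultdict(lambda: defaultdict(int))
    for sent in corpus:
        for t, (w, _) in enumerate(sent):
            counts[context(sent, t, parents)][w] += 1
    return counts

def loglik(corpus, counts, parents, vocab_size):
    """Held-out log-likelihood with add-one smoothing."""
    ll = 0.0
    for sent in corpus:
        for t, (w, _) in enumerate(sent):
            ctx = context(sent, t, parents)
            total = sum(counts[ctx].values())
            ll += math.log((counts[ctx][w] + 1) / (total + vocab_size))
    return ll

vocab = {w for sent in train for w, _ in sent}
# Candidate DBN structures, i.e. candidate sets of parents for W_t.
candidates = [("prev_word",), ("class", "prev_class"), ("class", "prev_word")]
scores = {p: loglik(heldout, estimate(train, p), p, len(vocab))
          for p in candidates}
best = max(scores, key=scores.get)
print("selected dependencies for W_t:", best)
```

In this reading, a classical word n-gram and a class-based model are just two particular structures in the candidate set, which is why the framework can process words and classes in the same model and sometimes do so with lower complexity (class-conditioned tables are smaller than word-conditioned ones).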