A maximum entropy language model integrating N-grams and topic dependencies for conversational speech recognition

  • Authors:
  • S. Khudanpur; Jun Wu

  • Affiliations:
  • Center for Language & Speech Processing, Johns Hopkins University, Baltimore, MD, USA

  • Venue:
  • ICASSP '99: Proceedings of the 1999 IEEE International Conference on Acoustics, Speech, and Signal Processing - Volume 01
  • Year:
  • 1999

Abstract

A compact language model that incorporates local dependencies in the form of N-grams and long-distance dependencies through dynamic topic-conditional constraints is presented. These constraints are integrated using the maximum entropy principle. Issues in assigning a topic to a test utterance are investigated. Recognition results on the Switchboard corpus are presented, showing that with a very small increase in the number of model parameters, reductions in word error rate and language model perplexity are achieved over trigram models. Some analysis follows, demonstrating that the gains are even larger on content-bearing words. The results are compared with those obtained by interpolating topic-independent and topic-specific N-gram models. The framework presented here extends easily to incorporate other forms of statistical dependencies, such as syntactic word-pair relationships or hierarchical topic constraints.
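For orientation, the abstract's combination of N-gram and topic constraints fits the standard conditional maximum entropy form sketched below. The notation here (trigram history, topic variable t, binary features f_k) is an assumption based on the abstract's description, not the paper's exact formulation:

P(w_i \mid w_{i-2}, w_{i-1}, t) = \frac{1}{Z(w_{i-2}, w_{i-1}, t)} \exp\Big( \sum_k \lambda_k f_k(w_{i-2}, w_{i-1}, t, w_i) \Big)

Here the f_k would include the usual unigram, bigram, and trigram indicator features alongside topic-conditional word features such as f(t, w_i), the lambda_k are weights trained to match observed feature expectations, and Z normalizes over the vocabulary. Because the topic constraints reuse the same normalized exponential model rather than adding a separately trained topic N-gram, only a small number of parameters is added, which is consistent with the compactness claim in the abstract.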