Experience with a stack decoder-based HMM CSR and back-off N-gram language models

  • Authors: Douglas B. Paul
  • Venue: HLT '91 Proceedings of the workshop on Speech and Natural Language
  • Year: 1991

Abstract

Stochastic language models are more useful than non-stochastic models because they contribute more information than a simple acceptance or rejection of a word sequence. Back-off N-gram language models [11] are an effective class of word-based stochastic language model. The first part of this paper describes our experiences using back-off language models in our time-synchronous decoder CSR. A bigram back-off language model was chosen as the language model for the informal ATIS CSR baseline evaluation test [13, 21].

The stack decoder [2, 8, 24] is a promising control structure for a speech-understanding system because it can combine constraints from both the acoustic model and a long-span language model (such as a natural language processor (NLP)) into a single integrated search [17]. A copy of the Lincoln time-synchronous HMM CSR has been converted to a stack-decoder-controlled search with stochastic language models. The second part of this paper describes our experiences with our prototype stack decoder CSR using no grammar, the word-pair grammar, and N-gram back-off language models.
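To make the two techniques concrete, here is a minimal sketch of a Katz-style back-off bigram estimator. It is illustrative only, not the paper's implementation: the fixed absolute discount and all function names are assumptions for this sketch, whereas Katz [11] derives the discounts from Good-Turing counts.

```python
from collections import Counter, defaultdict

def train_backoff_bigram(corpus, discount=0.5):
    """Build a Katz-style back-off bigram model from tokenized sentences.

    `discount` is an illustrative fixed absolute discount; Katz [11]
    derives the discounts from Good-Turing counts instead.
    """
    unigrams = Counter()
    successors = defaultdict(Counter)      # successors[w1][w2] = c(w1 w2)
    for sentence in corpus:
        tokens = ["<s>"] + sentence + ["</s>"]
        unigrams.update(tokens)
        for w1, w2 in zip(tokens, tokens[1:]):
            successors[w1][w2] += 1
    total = sum(unigrams.values())

    def p_unigram(w):
        return unigrams[w] / total         # ML unigram (unsmoothed here)

    def p_bigram(w1, w2):
        # Assumes w1 appeared in training; a real model needs an OOV plan.
        succ = successors[w1]
        c1 = unigrams[w1]
        if succ[w2] > 0:
            # Seen bigram: discounted maximum-likelihood estimate.
            return (succ[w2] - discount) / c1
        # Unseen bigram: back off to the unigram model, scaled by the
        # probability mass freed by discounting the seen bigrams.
        freed = discount * len(succ) / c1
        unseen_mass = 1.0 - sum(p_unigram(w) for w in succ)
        return freed * p_unigram(w2) / unseen_mass

    return p_bigram


corpus = [["show", "flights", "to", "boston"],
          ["show", "fares", "to", "boston"]]
p = train_backoff_bigram(corpus)
print(p("show", "flights"))   # seen bigram: discounted estimate (0.25)
print(p("to", "fares"))       # unseen bigram: backed-off estimate (0.025)
```

For seen bigrams the estimate is the discounted relative frequency; the mass removed by discounting is redistributed over unseen successors in proportion to their unigram probabilities, so each conditional distribution still sums to one.

The stack decoder itself is, at its core, a best-first search over partial sentence hypotheses. The sketch below shows only that control structure; the `expand` callback, which would supply combined acoustic and language-model log-scores per word extension, is a placeholder assumption, and real stack decoders [2, 8, 24] must also normalize scores so hypotheses of different lengths are comparable.

```python
import heapq

def stack_decode(expand, is_complete, max_pops=100000):
    """Best-first ("stack") search over partial word sequences.

    `expand(hyp)` yields (word, log_score_delta) extensions that already
    combine acoustic and language-model scores; `is_complete(hyp)` marks
    finished hypotheses. Score normalization across hypothesis lengths,
    essential in practice, is omitted from this sketch.
    """
    heap = [(0.0, [])]                     # min-heap of (-log_score, hypothesis)
    for _ in range(max_pops):
        if not heap:
            break
        neg_score, hyp = heapq.heappop(heap)
        if is_complete(hyp):
            return hyp, -neg_score         # best-scoring complete hypothesis
        for word, delta in expand(hyp):
            heapq.heappush(heap, (neg_score - delta, hyp + [word]))
    return None, float("-inf")
```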