Upper Bound Kullback–Leibler Divergence for Transient Hidden Markov Models

  • Authors:
  • J. Silva; S. Narayanan

  • Affiliations:
  • Dept. of Electr. Eng., Univ. of Southern California, Los Angeles, CA

  • Venue:
  • IEEE Transactions on Signal Processing
  • Year:
  • 2008

Abstract

This paper reports an upper bound for the Kullback-Leibler divergence (KLD) for a general family of transient hidden Markov models (HMMs). An upper bound KLD (UBKLD) expression for Gaussian mixture models (GMMs) is presented and then generalized to the case of HMMs. Moreover, this formulation is extended to HMMs with nonemitting states, where under some general assumptions the UBKLD is proved to be well defined for a general family of transient models. In particular, the UBKLD has a computationally efficient closed form for HMMs with left-to-right topology and a final nonemitting state, which we refer to as left-to-right transient HMMs. Finally, the usefulness of the closed-form expression is experimentally evaluated for automatic speech recognition (ASR) applications, where left-to-right transient HMMs are used to model basic acoustic-phonetic units. Results show that the UBKLD is an accurate discrimination indicator for comparing acoustic HMMs used for ASR.
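The GMM bound referenced in the abstract is closely related to the well-known matched-pair upper bound that follows from the convexity of the KL divergence: the divergence between two mixtures is bounded by the divergence between their weight distributions plus the weighted divergences between paired components. The sketch below is a rough illustration only; it assumes two GMMs with the same number of diagonal-covariance components paired by index, and the function names (`kl_diag_gaussians`, `ubkld_gmm`) are hypothetical. The exact formulation derived in the paper may differ in detail.

```python
import numpy as np

def kl_diag_gaussians(mu0, var0, mu1, var1):
    """Closed-form KL divergence between two diagonal-covariance Gaussians,
    KL( N(mu0, diag(var0)) || N(mu1, diag(var1)) )."""
    return 0.5 * np.sum(var0 / var1 + (mu1 - mu0) ** 2 / var1 - 1.0
                        + np.log(var1 / var0))

def ubkld_gmm(weights0, means0, vars0, weights1, means1, vars1):
    """Matched-pair (convexity-based) upper bound on KL(GMM0 || GMM1):
    KL(weights0 || weights1) + sum_i weights0[i] * KL(component_i || component'_i).
    Assumes both mixtures have the same number of components, paired by index."""
    kl_weights = np.sum(weights0 * np.log(weights0 / weights1))
    kl_components = sum(
        w * kl_diag_gaussians(m0, v0, m1, v1)
        for w, m0, v0, m1, v1 in zip(weights0, means0, vars0, means1, vars1)
    )
    return kl_weights + kl_components

# Example: two 2-component GMMs in two dimensions (toy values).
weights0 = np.array([0.6, 0.4])
means0 = np.array([[0.0, 0.0], [3.0, 3.0]])
vars0 = np.array([[1.0, 1.0], [0.5, 0.5]])

weights1 = np.array([0.5, 0.5])
means1 = np.array([[0.1, -0.1], [2.8, 3.2]])
vars1 = np.array([[1.2, 0.9], [0.6, 0.4]])

print(ubkld_gmm(weights0, means0, vars0, weights1, means1, vars1))
```

For the HMM case discussed in the abstract, bounds of this type are typically accumulated over states, with each per-state observation-model bound weighted by how often that state is expected to be visited; for left-to-right transient topologies these expected occupancies are finite, which is what makes a closed-form evaluation tractable.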