Support Vector Machine Training for Improved Hidden Markov Modeling

  • Authors:
  • A. Sloin; D. Burshtein

  • Affiliations:
  • Tel Aviv University, Tel Aviv

  • Venue:
  • IEEE Transactions on Signal Processing
  • Year:
  • 2008

Abstract

We present a discriminative training algorithm that uses support vector machines (SVMs) to improve the classification of discrete and continuous output probability hidden Markov models (HMMs). The algorithm uses a set of maximum-likelihood (ML) trained HMMs as a baseline system and an SVM training scheme to rescore the results of the baseline HMMs. It turns out that the rescoring model can be represented as an unnormalized HMM. We describe two algorithms for training the unnormalized HMMs, for both the discrete and continuous cases. One of the algorithms results in a single set of unnormalized HMMs that can be used in the standard recognition procedure (the Viterbi recognizer) as if they were plain HMMs. We use a toy problem and an isolated noisy digit recognition task to compare our new method to standard ML training. Our experiments show that SVM rescoring of hidden Markov models typically reduces the error rate significantly compared to standard ML training.
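
To make the rescoring idea concrete, below is a minimal sketch of combining ML-trained HMMs with an SVM-based rescoring step. It is a simplified illustration, not the paper's unnormalized-HMM formulation: it uses per-class HMM log-likelihoods as features for a linear SVM, and it assumes the numpy, hmmlearn, and scikit-learn libraries. The toy data generator, model sizes, and class means are illustrative assumptions.

```python
# Sketch: rescoring ML-trained HMMs with a linear SVM (simplified variant).
# Assumes numpy, hmmlearn, and scikit-learn; all data here is synthetic.
import numpy as np
from hmmlearn import hmm
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

def sample_sequences(mean, n_seqs=30, length=25):
    """Toy 1-D Gaussian observation sequences for one class (hypothetical data)."""
    return [mean + rng.standard_normal((length, 1)) for _ in range(n_seqs)]

# Two classes with slightly different observation statistics.
train = {0: sample_sequences(0.0), 1: sample_sequences(0.7)}
test = {0: sample_sequences(0.0, n_seqs=10), 1: sample_sequences(0.7, n_seqs=10)}

# Step 1: baseline system -- one ML (Baum-Welch) trained HMM per class.
models = {}
for label, seqs in train.items():
    X = np.concatenate(seqs)
    lengths = [len(s) for s in seqs]
    m = hmm.GaussianHMM(n_components=2, covariance_type="diag",
                        n_iter=50, random_state=0)
    m.fit(X, lengths)
    models[label] = m

def loglik_features(seq):
    """Per-class HMM log-likelihoods used as the SVM feature vector."""
    return np.array([models[c].score(seq) for c in sorted(models)])

# Step 2: SVM rescoring -- train a linear SVM on the baseline HMM scores.
feats = np.array([loglik_features(s) for lbl in train for s in train[lbl]])
labels = np.array([lbl for lbl in train for _ in train[lbl]])
svm = LinearSVC(C=1.0).fit(feats, labels)

# Step 3: compare the plain ML decision with the SVM-rescored decision.
ml_correct = svm_correct = total = 0
for lbl, seqs in test.items():
    for s in seqs:
        f = loglik_features(s)
        ml_correct += int(np.argmax(f) == lbl)
        svm_correct += int(svm.predict(f.reshape(1, -1))[0] == lbl)
        total += 1
print(f"ML decision accuracy:    {ml_correct / total:.2f}")
print(f"SVM rescoring accuracy:  {svm_correct / total:.2f}")
```

In the paper itself, the SVM-trained rescoring model is shown to be representable as an unnormalized HMM, so the rescored decision can be made with the standard Viterbi recognizer; the sketch above only mimics the overall two-stage structure (ML baseline, then discriminative rescoring).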