Component-based discriminative classification for hidden Markov models

  • Authors:
  • Manuele Bicego; Elżbieta Pękalska; David M. J. Tax; Robert P. W. Duin

  • Affiliations:
  • Computer Science Department, University of Verona, Strada le Grazie, 15, 37134 Verona, Italy and DEIR, University of Sassari, via Torre Tonda, 34, 07100 Sassari, Italy; School of Computer Science, University of Manchester, Oxford Road, M13 9PL Manchester, UK; Delft University of Technology, Mekelweg 4, 2628 CD Delft, The Netherlands; Delft University of Technology, Mekelweg 4, 2628 CD Delft, The Netherlands

  • Venue:
  • Pattern Recognition
  • Year:
  • 2009

Abstract

Hidden Markov models (HMMs) have been successfully applied to a wide range of sequence modeling problems. In the classification context, one of the simplest approaches is to train a single HMM per class. A test sequence is then assigned to the class whose HMM yields the maximum a posteriori (MAP) probability. This generative scenario works well when the models are correctly estimated. However, the results can become poor when improper models are employed, due to the lack of prior knowledge, poor estimates, violated assumptions or insufficient training data. To improve the results in these cases, we propose to combine the descriptive strengths of HMMs with discriminative classifiers. This is achieved by training feature-based classifiers in an HMM-induced vector space defined by specific components of individual hidden Markov models. We introduce four major ways of building such vector spaces and study which trained combiners are useful in which context. Moreover, we motivate and discuss the merit of our method in comparison to dynamic kernels, in particular, to the Fisher kernel approach.
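As a rough illustration of the two classification schemes sketched in the abstract, the following Python snippet (assuming the hmmlearn and scikit-learn libraries, which are not part of the paper) fits one Gaussian HMM per class, applies the generative MAP rule, and builds one hypothetical HMM-induced vector representation, namely per-class length-normalised log-likelihoods concatenated with mean state posteriors, on which a discriminative SVM is trained. This particular embedding is an illustrative stand-in and is not claimed to be one of the four vector-space constructions proposed in the paper.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM   # assumed dependency, for illustration only
from sklearn.svm import SVC


def fit_class_hmms(train_seqs, train_labels, n_states=3, seed=0):
    """Fit one Gaussian HMM per class (the generative baseline)."""
    hmms = {}
    for c in np.unique(train_labels):
        seqs = [s for s, y in zip(train_seqs, train_labels) if y == c]
        X = np.vstack(seqs)                 # hmmlearn expects concatenated sequences
        lengths = [len(s) for s in seqs]    # plus the length of each sequence
        m = GaussianHMM(n_components=n_states, covariance_type="diag",
                        n_iter=50, random_state=seed)
        m.fit(X, lengths)
        hmms[c] = m
    return hmms


def map_classify(hmms, seq, log_priors):
    """Generative MAP rule: argmax_c  log p(seq | HMM_c) + log P(c)."""
    scores = {c: m.score(seq) + log_priors[c] for c, m in hmms.items()}
    return max(scores, key=scores.get)


def hmm_vector_space(hmms, seq):
    """One hypothetical HMM-induced embedding: for each class HMM, the
    length-normalised log-likelihood plus the mean state-posterior vector,
    all concatenated into a single fixed-length feature vector."""
    feats = []
    for c in sorted(hmms):
        m = hmms[c]
        feats.append(m.score(seq) / len(seq))             # normalised log-likelihood
        feats.extend(m.predict_proba(seq).mean(axis=0))   # average state occupancy
    return np.asarray(feats)


# Usage sketch (train_seqs / test_seqs are lists of (T_i, d) observation arrays):
# hmms = fit_class_hmms(train_seqs, train_labels)
# Z_train = np.array([hmm_vector_space(hmms, s) for s in train_seqs])
# clf = SVC(kernel="rbf").fit(Z_train, train_labels)
# y_pred = clf.predict(np.array([hmm_vector_space(hmms, s) for s in test_seqs]))
```

In this sketch the discriminative classifier (here an RBF-kernel SVM) operates on a fixed-length vector derived from the trained HMMs, so any feature-based classifier or trained combiner could be substituted; the choice of which HMM components enter the embedding is exactly the design question the paper studies.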