A discriminative model corresponding to hierarchical HMMs

  • Authors:
  • Takaaki Sugiura, Naoto Goto, Akira Hayashi

  • Affiliations:
  • Graduate School of Information Sciences, Hiroshima City University, Asaminami-ku, Hiroshima, Japan (all authors)

  • Venue:
  • IDEAL'07: Proceedings of the 8th International Conference on Intelligent Data Engineering and Automated Learning
  • Year:
  • 2007

Abstract

Hidden Markov Models (HMMs) are popular generative models for sequence data. Recent work has shown, however, that Conditional Random Fields (CRFs), a class of discriminative models, outperform HMMs on many tasks. We propose Hierarchical Hidden Conditional Random Fields (HHCRFs), a discriminative model corresponding to hierarchical HMMs (HHMMs). HHCRFs model the conditional probability of the upper-level states given the observations; the lower-level states are hidden and are marginalized out in the model definition. We have developed two algorithms for the model: a parameter learning algorithm that requires only the upper-level states in the training data, and a marginalized Viterbi algorithm that computes the most likely upper-level state sequences by marginalizing over the lower-level states. In an experiment on segmenting electroencephalographic (EEG) data for a Brain-Computer Interface, HHCRFs outperform HHMMs.
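
As a rough sketch of the model form described in the abstract (the generic hidden-CRF shape; the symbols y for the upper-level state sequence, h for the hidden lower-level state sequence, x for the observations, \theta for the parameters, and f for the feature vector are chosen here for illustration and are not taken from the paper):

  % Assumed generic hidden-CRF form, not necessarily the paper's exact parameterization:
  % the lower-level states h are summed out in both numerator and normalizer.
  \[
    p(y \mid x; \theta)
      = \frac{\sum_{h} \exp\bigl(\theta^{\top} f(y, h, x)\bigr)}
             {\sum_{y'} \sum_{h} \exp\bigl(\theta^{\top} f(y', h, x)\bigr)}
  \]

Under the same assumed notation, the marginalized Viterbi algorithm mentioned above would return y* = \arg\max_{y} \sum_{h} \exp(\theta^{\top} f(y, h, x)), i.e. the most likely upper-level sequence after summing out the lower-level states, rather than the jointly most likely (y, h) pair.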