The Hierarchical Hidden Markov Model: Analysis and Applications

  • Authors:
  • Shai Fine, Yoram Singer, Naftali Tishby

  • Affiliations:
  • Shai Fine: Institute of Computer Science, Hebrew University, Jerusalem 91904, Israel. E-mail: shai@cs.huji.ac.il
  • Yoram Singer: AT&T Labs, 180 Park Avenue, Florham Park, NJ 07932. E-mail: singer@research.att.com
  • Naftali Tishby: Institute of Computer Science, Hebrew University, Jerusalem 91904, Israel. E-mail: tishby@cs.huji.ac.il

  • Venue:
  • Machine Learning
  • Year:
  • 1998


Abstract

We introduce, analyze and demonstrate a recursive hierarchical generalization of the widely used hidden Markov models, which we name Hierarchical Hidden Markov Models (HHMM). Our model is motivated by the complex multi-scale structure which appears in many natural sequences, particularly in language, handwriting and speech. We seek a systematic unsupervised approach to the modeling of such structures. By extending the standard Baum-Welch (forward-backward) algorithm, we derive an efficient procedure for estimating the model parameters from unlabeled data. We then use the trained model for automatic hierarchical parsing of observation sequences. We describe two applications of our model and its parameter estimation procedure. In the first application we show how to construct hierarchical models of natural English text. In these models different levels of the hierarchy correspond to structures on different length scales in the text. In the second application we demonstrate how HHMMs can be used to automatically identify repeated strokes that represent combinations of letters in cursive handwriting.
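The abstract takes the standard Baum-Welch (forward-backward) algorithm as the starting point for the hierarchical generalization. As background only, here is a minimal NumPy sketch of the flat-HMM E-step that Baum-Welch builds on; all variable names and the unscaled formulation are illustrative choices, not taken from the paper:

```python
import numpy as np

def forward_backward(obs, pi, A, B):
    """E-step of standard (flat) Baum-Welch, for illustration only.

    obs: observation indices, shape (T,)
    pi:  initial state distribution, shape (N,)
    A:   transitions, A[i, j] = P(state j | state i), shape (N, N)
    B:   emissions, B[i, k] = P(symbol k | state i), shape (N, M)
    Returns per-timestep state posteriors gamma, shape (T, N).
    """
    T, N = len(obs), len(pi)
    alpha = np.zeros((T, N))  # forward probabilities
    beta = np.zeros((T, N))   # backward probabilities

    # Forward pass: alpha[t, j] = P(o_1..o_t, state_t = j)
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]

    # Backward pass: beta[t, i] = P(o_{t+1}..o_T | state_t = i)
    beta[T - 1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])

    # Posteriors gamma[t, i] = P(state_t = i | o_1..o_T),
    # which the M-step would use to re-estimate pi, A and B.
    gamma = alpha * beta
    gamma /= gamma.sum(axis=1, keepdims=True)
    return gamma
```

The HHMM contribution described above replaces this flat state space with a recursive hierarchy of states, so the forward-backward recursions must additionally track which level of the hierarchy generated each sub-sequence.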