Techniques to incorporate the benefits of a hierarchy in a modified hidden Markov model

  • Authors:
  • Lin-Yi Chou

  • Affiliations:
  • University of Waikato, Hamilton, New Zealand

  • Venue:
  • COLING-ACL '06: Proceedings of the COLING/ACL 2006 Main Conference Poster Sessions
  • Year:
  • 2006


Abstract

This paper explores techniques that take advantage of the fundamental difference in structure between hidden Markov models (HMMs) and hierarchical hidden Markov models (HHMMs). The HHMM structure allows repeated parts of the model to be merged together. A merged model exploits the recurring patterns within the hierarchy, and the clusters that exist in some sequences of observations, to increase extraction accuracy. This paper also presents a new technique for reconstructing grammar rules automatically. This work builds on the idea of combining a phrase extraction method with HHMMs to expose patterns within English text. The reconstruction is then used to simplify the complex structure of an HHMM. The models discussed here are evaluated by applying them to natural language tasks based on CoNLL-2004 and a sub-corpus of the Lancaster Treebank.
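The abstract's central idea of merging repeated parts of an HHMM can be illustrated with a minimal sketch: when the same sub-model appears under several parent states in the hierarchy, it can be represented once and shared, so observations from every context update the same parameters. This is not the paper's implementation; the names (SubHMM, NP, SUBJECT, OBJECT) and the count-based update are illustrative assumptions only.

```python
from collections import defaultdict

class SubHMM:
    """A flat sub-HMM holding emission counts for its hidden states."""
    def __init__(self, name, states):
        self.name = name
        self.states = states
        self.emission_counts = {s: defaultdict(int) for s in states}

    def observe(self, state, symbol):
        # Accumulate evidence; because the sub-model is shared (merged),
        # counts from every parent context land here.
        self.emission_counts[state][symbol] += 1

    def emission_prob(self, state, symbol):
        counts = self.emission_counts[state]
        total = sum(counts.values())
        return counts[symbol] / total if total else 0.0

# One shared sub-model for noun phrases, reused under two parent states
# of the hierarchy instead of being duplicated.
np_sub = SubHMM("NP", states=["DET", "NOUN"])

hierarchy = {
    "SUBJECT": np_sub,   # subject position calls the NP sub-HMM
    "OBJECT":  np_sub,   # object position reuses (merges with) the same one
}

# Training observations from different contexts all feed the shared model.
hierarchy["SUBJECT"].observe("DET", "the")
hierarchy["OBJECT"].observe("DET", "the")
hierarchy["OBJECT"].observe("NOUN", "ball")

print(np_sub.emission_prob("DET", "the"))  # 1.0: both contexts contributed
```

In an unmerged model, SUBJECT and OBJECT would each carry their own NP parameters and split the training evidence between them; merging pools that evidence, which is the intuition behind the accuracy gains the abstract describes.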