Exploring the state sequence space for hidden Markov and semi-Markov chains
Computational Statistics & Data Analysis
Hidden Markov models (HMMs) are currently employed in a wide variety of applications, including speech recognition, target tracking, and protein sequence analysis. The Viterbi algorithm is perhaps the best-known method for tracking the hidden states of a process from a sequence of observations. An important problem when tracking a process with an HMM is estimating the uncertainty present in the solution. In this correspondence, an algorithm is introduced for computing at runtime the entropy of the possible hidden state sequences that may have produced a given sequence of observations. A brute-force computation of this quantity requires a number of calculations exponential in the length of the observation sequence. The proposed algorithm, by contrast, is based on a trellis structure resembling that of the Viterbi algorithm and computes the entropy with complexity linear in the number of observations.
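The idea can be sketched in code. The following is a minimal illustration, not the paper's own implementation: a forward (trellis) recursion that carries, for each state, both the forward probability and the entropy of the partial state sequences ending in that state, together with an exponential-time brute-force enumeration used only as a correctness check. All names (pi, A, B, obs) and the toy HMM are assumptions for the example.

```python
import itertools
import math

def sequence_entropy_trellis(pi, A, B, obs):
    """Entropy (in bits) of the posterior over hidden state sequences
    given `obs`, computed in O(T * N^2) with a Viterbi-like trellis.
    pi: initial probabilities (N,), A: transition matrix (N x N),
    B: emission matrix (N x M), obs: list of observation indices."""
    N = len(pi)
    # t = 0: forward probabilities; partial-sequence entropies start at 0.
    alpha = [pi[j] * B[j][obs[0]] for j in range(N)]
    H = [0.0] * N
    for t in range(1, len(obs)):
        new_alpha, new_H = [0.0] * N, [0.0] * N
        for j in range(N):
            # Unnormalized weights of each predecessor state i for state j.
            w = [alpha[i] * A[i][j] for i in range(N)]
            tot = sum(w)
            new_alpha[j] = tot * B[j][obs[t]]
            if tot > 0:
                # Chain rule for entropy: entropy of choosing the
                # predecessor, plus the expected entropy it carries.
                for i in range(N):
                    p = w[i] / tot
                    if p > 0:
                        new_H[j] += p * (H[i] - math.log2(p))
        alpha, H = new_alpha, new_H
    # Mix over the final state to obtain the total entropy.
    total = sum(alpha)
    entropy = 0.0
    for j in range(N):
        p = alpha[j] / total
        if p > 0:
            entropy += p * (H[j] - math.log2(p))
    return entropy

def sequence_entropy_bruteforce(pi, A, B, obs):
    """Exponential-time reference: enumerate all N^T state sequences."""
    N, T = len(pi), len(obs)
    probs = []
    for seq in itertools.product(range(N), repeat=T):
        p = pi[seq[0]] * B[seq[0]][obs[0]]
        for t in range(1, T):
            p *= A[seq[t - 1]][seq[t]] * B[seq[t]][obs[t]]
        probs.append(p)
    z = sum(probs)
    return -sum((p / z) * math.log2(p / z) for p in probs if p > 0)
```

On a two-state HMM with T observations the brute force touches 2^T sequences, while the trellis recursion performs a constant amount of work per time step, matching the linear complexity claimed above.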