Augmenting a hidden Markov model for phrase-dependent word tagging
HLT '89 Proceedings of the workshop on Speech and Natural Language
The paper presents a new algorithm for estimating the parameters of a hidden stochastic context-free grammar. In contrast to the Inside/Outside (I/O) algorithm, it does not require the grammar to be expressed in Chomsky normal form, and thus can operate directly on more natural representations of a grammar. The algorithm uses a trellis-based structure rather than the binary branching tree structure used by the I/O algorithm. The form of the trellis is an extension of that used by the Forward/Backward (F/B) algorithm, and as a result the new algorithm reduces to the F/B algorithm for components that can be modelled as finite-state networks. In the same way that a hidden Markov model (HMM) is a stochastic analogue of a finite-state network, the representation used by the new algorithm is a stochastic analogue of a recursive transition network, in which a state may be simple or may itself contain an underlying structure.
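To make the connection to the F/B trellis concrete, the following is a minimal sketch of the Forward pass of the Forward/Backward algorithm on a flat HMM trellis, the finite-state base case to which the paper's algorithm reduces. The extension described in the paper would additionally allow a trellis state to expand into an underlying sub-network; that recursive case is not shown here. All model parameters below are illustrative toy values, not taken from the paper.

```python
import numpy as np

def forward(pi, A, B, obs):
    """Forward pass over an HMM trellis: returns P(obs) for an HMM with
    initial distribution pi, transition matrix A (states x states), and
    emission matrix B (states x symbols)."""
    alpha = pi * B[:, obs[0]]          # first trellis column
    for t in obs[1:]:
        alpha = (alpha @ A) * B[:, t]  # propagate one column along the trellis
    return alpha.sum()

# Toy 2-state, 2-symbol model (hypothetical values for illustration).
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],
              [0.2, 0.8]])

p = forward(pi, A, B, [0, 1, 0])
```

In a stochastic recursive transition network, a state visited during this column-by-column propagation could itself be a network whose internal probability is computed by the same procedure, which is what lets the trellis representation subsume context-free structure.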