Discriminative training of hidden Markov models (HMMs) has been a central theme in speech recognition research for many years. One of the most popular techniques is minimum classification error (MCE) training, whose objective function is closely related to the empirical error rate and whose optimization has traditionally been based on gradient descent. In this paper, we provide a new look at the MCE technique in two ways. First, we develop a non-trivial framework in which the MCE objective function is reformulated as a rational function over multiple sentence-level training tokens. Second, using this novel reformulation, we develop a new optimization method for discriminatively estimating HMM parameters based on growth transformation, also known as the extended Baum-Welch algorithm. Technical details are given for the use of lattices as a rich representation of the competing candidates needed in MCE training.
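As a rough illustration of the growth-transformation idea the abstract refers to, the sketch below applies one extended Baum-Welch step to a single discrete probability vector, using the classic update p_i' = (c_i + D p_i) / (sum_j c_j + D), where c_i is the difference between numerator (correct-transcription) and denominator (competitor) statistics of a rational objective. This is a minimal sketch, not the paper's actual estimation formulas: the function name, the toy counts, and the simple heuristic for the smoothing constant D are all illustrative assumptions.

```python
import numpy as np

def ebw_update(p, num_counts, den_counts, eps=1e-3):
    """One growth-transformation (extended Baum-Welch) step for a
    discrete probability vector `p`.

    num_counts: statistics accumulated from the numerator of the
                rational objective (e.g., the correct transcription).
    den_counts: statistics from the denominator (competing candidates).
    The update p_i' = (c_i + D*p_i) / (sum_j c_j + D), with
    c_i = num_counts_i - den_counts_i, increases a rational objective
    when D is chosen large enough to keep all parameters positive.
    """
    c = num_counts - den_counts
    # Hypothetical heuristic: pick D just large enough that every
    # numerator term c_i + D*p_i stays strictly positive.
    D = max(0.0, float(-(c / p).min())) + eps
    p_new = (c + D * p) / (c.sum() + D)
    return p_new / p_new.sum()  # guard against floating-point drift
```

Note that with this choice of D the denominator c.sum() + D is automatically positive (since each c_i > -D p_i and the p_i sum to one), so the updated vector is a valid probability distribution by construction.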