Recently, Gopalakrishnan et al. (1989) introduced a reestimation formula for discrete HMMs (hidden Markov models) that applies to rational objective functions such as the MMIE (maximum mutual information estimation) criterion. The authors analyze this formula and show how its convergence rate can be substantially improved. They introduce a corrective MMIE training algorithm which, when applied to the TI/NIST connected digit database, reduces the string error rate by close to 50%. They also extend Gopalakrishnan's result to the continuous case by proposing a new formula for estimating the mean and variance parameters of diagonal Gaussian densities.
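The core of the Gopalakrishnan et al. (1989) result is a growth-transform update for a probability vector under a rational objective F: each parameter is scaled by its partial derivative plus a smoothing constant, then renormalized. The constant must be large enough to keep every numerator positive; larger values guarantee growth of F but slow convergence, which is why its choice matters for the training speed discussed in the abstract. The following is a minimal illustrative sketch (the function name, the toy gradient values, and the choice of constant are assumptions for illustration, not from the paper):

```python
import numpy as np

def growth_transform_update(p, grad, C):
    """One reestimation step for a discrete probability vector p under a
    rational objective F, in the style of Gopalakrishnan et al. (1989).

    p    : current parameter vector (non-negative, sums to 1)
    grad : partial derivatives dF/dp_k evaluated at p (hypothetical values here)
    C    : smoothing constant, chosen so every p_k * (grad_k + C) > 0;
           larger C ensures F grows but slows convergence.
    """
    num = p * (grad + C)          # scale each component by its derivative + C
    return num / num.sum()        # renormalize back onto the simplex

# Toy example: a 3-symbol emission distribution and an arbitrary gradient.
p = np.array([0.5, 0.3, 0.2])
grad = np.array([1.0, -2.0, 0.5])   # hypothetical dF/dp_k values
C = 3.0                              # large enough that all numerators are positive
p_new = growth_transform_update(p, grad, C)
```

Note how the component with the largest gradient gains probability mass while the result remains a valid distribution; the convergence-rate question the paper addresses is precisely how small C can be made while preserving this growth property.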