This paper addresses the local-convergence problem of Expectation Maximization (EM) based training of Hidden Markov Models (HMMs) in speech recognition. We propose a hybrid algorithm, the Simulated Annealing Stochastic version of EM (SASEM), which combines Simulated Annealing (SA) with EM: the HMM estimation process is reformulated with a stochastic step inserted between the EM iterations and the SA acceptance test. These stochastic perturbations inside EM keep it from converging prematurely to a local maximum, while the global convergence properties of SA yield improved HMM estimates. Experiments on the TIMIT speech corpus show that SASEM achieves higher recognition accuracy than standard EM.
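The loop the abstract describes (perturb the current estimate, refine it with an EM step, then accept or reject the move with the SA Metropolis criterion) can be sketched as follows. The paper applies this to HMM training; as a self-contained stand-in, the sketch below anneals EM for a two-component 1-D Gaussian mixture. The names `sasem`, `em_step`, and `log_lik`, the jitter scales, and the cooling schedule are all illustrative assumptions, not the authors' implementation.

```python
import math
import random

def _pdf(x, mu, s):
    """Density of N(mu, s^2) at x."""
    return math.exp(-0.5 * ((x - mu) / s) ** 2) / (s * math.sqrt(2.0 * math.pi))

def log_lik(data, w, mu1, s1, mu2, s2):
    """Log-likelihood of a two-component 1-D Gaussian mixture."""
    return sum(math.log(w * _pdf(x, mu1, s1) + (1.0 - w) * _pdf(x, mu2, s2))
               for x in data)

def em_step(data, w, mu1, s1, mu2, s2):
    """One EM iteration: E-step responsibilities, M-step parameter updates."""
    r = [w * _pdf(x, mu1, s1) /
         (w * _pdf(x, mu1, s1) + (1.0 - w) * _pdf(x, mu2, s2)) for x in data]
    n1 = sum(r)
    n2 = len(data) - n1
    mu1 = sum(ri * x for ri, x in zip(r, data)) / n1
    mu2 = sum((1.0 - ri) * x for ri, x in zip(r, data)) / n2
    s1 = max(math.sqrt(sum(ri * (x - mu1) ** 2 for ri, x in zip(r, data)) / n1), 1e-3)
    s2 = max(math.sqrt(sum((1.0 - ri) * (x - mu2) ** 2 for ri, x in zip(r, data)) / n2), 1e-3)
    return n1 / len(data), mu1, s1, mu2, s2

def sasem(data, params, t0=1.0, cooling=0.9, iters=40, seed=0):
    """SA-wrapped EM: jitter the estimate, refine with one EM step,
    then accept or reject the move with the Metropolis criterion."""
    rng = random.Random(seed)
    best = cur = params
    t = t0
    for _ in range(iters):
        w, mu1, s1, mu2, s2 = cur
        # Stochastic step: perturb parameters; jitter scale shrinks with temperature.
        cand = (min(max(w + rng.gauss(0.0, 0.05 * t), 0.05), 0.95),
                mu1 + rng.gauss(0.0, t), max(s1 + rng.gauss(0.0, 0.1 * t), 0.1),
                mu2 + rng.gauss(0.0, t), max(s2 + rng.gauss(0.0, 0.1 * t), 0.1))
        cand = em_step(data, *cand)  # local refinement by one EM step
        delta = log_lik(data, *cand) - log_lik(data, *cur)
        if delta > 0 or rng.random() < math.exp(delta / t):
            cur = cand               # uphill always accepted; downhill sometimes
        if log_lik(data, *cur) > log_lik(data, *best):
            best = cur               # keep the best estimate seen so far
        t *= cooling                 # annealing schedule
    return best
```

The Metropolis test is what distinguishes this from plain EM: at high temperature the loop occasionally accepts likelihood-decreasing moves, which lets it escape the local maxima EM gets trapped in, and as the temperature decays the procedure reduces to ordinary EM hill-climbing.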