A stochastic version of Expectation Maximization algorithm for better estimation of Hidden Markov Model

  • Authors:
  • Shamsul Huda; John Yearwood; Roberto Togneri

  • Affiliations:
  • Center for Informatics and Applied Optimization, School of Information Technology and Mathematical Science (ITMS), University of Ballarat, VIC 3353, Australia (S. Huda, J. Yearwood); Center for Intelligent Information Processing, School of Electrical, Electronic and Computer Engineering, University of Western Australia, WA, Australia (R. Togneri)

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2009

Abstract

This paper attempts to overcome the local-convergence problem of Expectation Maximization (EM)-based training of Hidden Markov Models (HMMs) in speech recognition. We propose a hybrid algorithm, the Simulated Annealing Stochastic version of EM (SASEM), which combines Simulated Annealing (SA) with EM by reformulating the HMM estimation process with a stochastic step interleaved between the EM updates. This stochastic step inside EM can prevent the algorithm from converging to a local maximum and yields improved HMM estimates by exploiting the global convergence properties of SA. Experiments on the TIMIT speech corpus show that SASEM obtains higher recognition accuracies than standard EM.
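
The abstract describes interleaving a stochastic, simulated-annealing-style accept/reject step with the deterministic EM updates. The sketch below is a minimal illustration of that general idea on a deliberately simple problem (a two-component 1-D Gaussian mixture rather than an HMM); it is not the paper's SASEM reformulation, and the function names, the proposal distribution, and the geometric cooling schedule are assumptions made for illustration only.

```python
# Hypothetical sketch: an EM refinement step followed by an SA perturbation of the
# parameters, accepted by the Metropolis rule under a decreasing temperature.
# Fits a two-component 1-D Gaussian mixture; NOT the authors' HMM implementation.
import numpy as np

rng = np.random.default_rng(0)

def log_likelihood(x, w, mu, sigma):
    # Mixture log-likelihood: sum_i log sum_k w_k N(x_i | mu_k, sigma_k)
    comp = w * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return np.sum(np.log(comp.sum(axis=1) + 1e-300))

def em_step(x, w, mu, sigma):
    # E-step: posterior responsibilities; M-step: closed-form parameter updates.
    comp = w * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    resp = comp / comp.sum(axis=1, keepdims=True)
    n_k = resp.sum(axis=0)
    w_new = n_k / len(x)
    mu_new = (resp * x[:, None]).sum(axis=0) / n_k
    var_new = (resp * (x[:, None] - mu_new) ** 2).sum(axis=0) / n_k
    return w_new, mu_new, np.sqrt(np.maximum(var_new, 1e-6))

def sa_em_fit(x, n_iter=200, t0=1.0, cooling=0.97):
    # Random initialisation of weights, means, and standard deviations.
    w = np.array([0.5, 0.5])
    mu = rng.choice(x, size=2)
    sigma = np.array([x.std(), x.std()])
    best, best_ll = (w, mu, sigma), log_likelihood(x, w, mu, sigma)
    temp = t0
    for _ in range(n_iter):
        # Deterministic EM refinement.
        w, mu, sigma = em_step(x, w, mu, sigma)
        ll = log_likelihood(x, w, mu, sigma)
        # Stochastic SA step: perturb the means, accept by the Metropolis criterion.
        mu_prop = mu + rng.normal(scale=temp, size=2)
        ll_prop = log_likelihood(x, w, mu_prop, sigma)
        if ll_prop > ll or rng.random() < np.exp((ll_prop - ll) / max(temp, 1e-12)):
            mu, ll = mu_prop, ll_prop
        if ll > best_ll:
            best, best_ll = (w.copy(), mu.copy(), sigma.copy()), ll
        temp *= cooling  # geometric cooling schedule (an assumption)
    return best, best_ll

# Toy usage: data drawn from two well-separated Gaussians.
data = np.concatenate([rng.normal(-3, 1, 500), rng.normal(4, 1.5, 500)])
params, ll = sa_em_fit(data)
print("final log-likelihood:", round(ll, 2))
```

The design choice mirrored from the abstract is that the stochastic perturbation sits between successive EM steps, so each proposal starts from a locally refined estimate while worse proposals are still accepted with a temperature-dependent probability, which is what allows the search to escape local maxima.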