Self-adaptive design of hidden Markov models

  • Authors:
  • Jie Li; Jiaxin Wang; Yannan Zhao; Zehong Yang

  • Affiliations:
  • Department of Computer Science and Technology, State Key Laboratory of Intelligent Technology and Systems, Tsinghua University, Beijing 100084, China (all authors)

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2004

Abstract

Hidden Markov models (HMMs) are stochastic models that have been widely used in speech and image processing in recent years. The number of states in a classical HMM is usually predefined and fixed during training, and may differ considerably from the true number of hidden states of the signal source. Moreover, in pattern recognition applications, different signal sources are likely to have different numbers of states and therefore cannot be well modeled by HMMs with a fixed state number. This paper proposes a self-adaptive design method for HMMs that overcomes this limitation: an HMM automatically matches its state number to the true state number of the signal source being modeled. To obtain a practicable training algorithm for the new HMM, the paper first introduces an entropic definition of the a priori probability of the model and, accordingly, a maximum a posteriori (MAP) training strategy; it then designs a MAP training algorithm for the fixed-state-number case based on the deterministic annealing (DA) technique. Building on this MAP training, a complete training method, named the shrink algorithm, is finally proposed for the new HMM. Experimental results indicate that self-adaptive HMMs model stochastic signals more accurately and achieve better pattern recognition performance than classical models.
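The core of a shrink-style training method is pruning states that the data barely uses and renormalizing the remaining parameters. The sketch below illustrates that pruning step only; the function name, the occupancy threshold, and the use of posterior occupancy counts from an E-step are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def shrink_states(A, pi, occupancy, eps=1e-2):
    """Toy sketch of a shrink-style pruning step (illustrative, not
    the paper's algorithm): drop HMM states whose share of the total
    expected occupancy falls below eps, then renormalize what remains.

    A         -- (n, n) row-stochastic transition matrix
    pi        -- (n,) initial state distribution
    occupancy -- (n,) expected state-occupancy counts (e.g. summed
                 posterior gammas from a forward-backward E-step)
    """
    keep = occupancy / occupancy.sum() > eps          # states to retain
    A2 = A[np.ix_(keep, keep)]                        # keep surviving rows/cols
    A2 = A2 / A2.sum(axis=1, keepdims=True)           # renormalize each row
    pi2 = pi[keep] / pi[keep].sum()                   # renormalize initial probs
    return A2, pi2

# Example: a 3-state model where state 2 is almost never occupied.
A = np.array([[0.6, 0.3, 0.1],
              [0.4, 0.5, 0.1],
              [0.5, 0.4, 0.1]])
pi = np.array([0.5, 0.45, 0.05])
occupancy = np.array([40.0, 55.0, 0.5])  # hypothetical E-step counts

A2, pi2 = shrink_states(A, pi, occupancy)
print(A2.shape)  # (2, 2) -- the under-used state is removed
```

In the paper's full method this pruning would be interleaved with entropic-prior MAP reestimation under deterministic annealing, so that the model shrinks from an over-specified state count toward the source's true one; the sketch above only shows the renormalization mechanics of removing a state.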