We propose a novel approach to learn the structure of Partially Observable Markov Models (POMMs) and to jointly estimate their parameters. POMMs are graphical models equivalent to Hidden Markov Models (HMMs). The model structure is built to support the First Passage Time (FPT) dynamics observed in the training sample. We argue that the FPTs in a POMM are closely related to the model structure. Starting from a standard Markov chain, states are iteratively added to the model. A novel algorithm, POMMPHit, is proposed to estimate the POMM transition probabilities that fit the sample FPT dynamics. The transitions with the lowest expected passage times are trimmed from the model. Practical evaluations on artificially generated data and on DNA sequence modeling show the benefits over Bayesian model induction or EM estimation of ergodic models with transition trimming.
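The link between FPT dynamics and model structure can be made concrete with the standard definition of expected first passage times in a Markov chain. The sketch below is a minimal illustration in plain NumPy, not the authors' POMMPHit estimator: it computes the matrix of expected first passage times of an ergodic chain by conditioning on the first step, so that m_ij = 1 + sum over k != j of P_ik * m_kj. The function name and the example transition matrix are illustrative assumptions.

```python
import numpy as np

def expected_first_passage_times(P):
    """Expected first passage times M[i, j] for an ergodic Markov chain
    with row-stochastic transition matrix P.

    For each target state j, solves the linear system obtained by
    conditioning on the first step:
        m_ij = 1 + sum_{k != j} P_ik * m_kj
    The diagonal M[j, j] is the mean recurrence (first return) time of j.
    """
    n = P.shape[0]
    M = np.zeros((n, n))
    for j in range(n):
        idx = [k for k in range(n) if k != j]          # all states except the target
        A = np.eye(n - 1) - P[np.ix_(idx, idx)]        # (I - P restricted to non-target states)
        m = np.linalg.solve(A, np.ones(n - 1))         # passage times into j from every other state
        M[idx, j] = m
        M[j, j] = 1.0 + P[j, idx] @ m                  # mean first return time to j
    return M

# Illustrative two-state chain (hypothetical numbers).
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
print(expected_first_passage_times(P))
```

Under this view, pruning transitions reshapes the expected passage times between states, which is why fitting the sample FPT dynamics can guide both state addition and transition trimming; the specific fitting procedure (POMMPHit) is described in the paper itself, not in this sketch.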