Hidden Markov and Independence Models with Patterns for Sequential BIST
VTS '00 Proceedings of the 18th IEEE VLSI Test Symposium
We present a new model, derived from classical Hidden Markov Models (HMMs), for learning sequences of large Boolean vectors. Our model, the Hidden Markov Model with Patterns (HMMP), differs from a standard HMM in that it uses patterns to define the emission probability distributions attached to its states. We also present an efficient state-merging algorithm that learns this model from training vector sequences. We apply the model and the algorithm to learning the Boolean vector sequences used to test integrated circuits. The learned HMMPs serve as test sequence generators: despite their reduced size, they achieve very high fault coverage, which demonstrates the effectiveness of our approach.
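To illustrate the core idea of pattern-defined emissions, the sketch below assumes a pattern is a string over {'0', '1', '*'}: fixed bits must match the emitted Boolean vector exactly, while each '*' bit is emitted as 0 or 1 with equal probability. All names and the uniform treatment of '*' bits are illustrative assumptions, not the authors' actual implementation.

```python
def emission_prob(pattern, vector):
    """Probability that an HMMP state with `pattern` emits `vector`.

    `pattern` is a string over {'0', '1', '*'}; `vector` is a sequence
    of 0/1 ints of the same length. Fixed bits must match exactly, and
    each '*' position contributes a factor of 0.5 (assumed uniform).
    """
    assert len(pattern) == len(vector)
    prob = 1.0
    for p, b in zip(pattern, vector):
        if p == '*':
            prob *= 0.5           # free bit: either value equally likely
        elif int(p) != b:
            return 0.0            # fixed bit mismatch: emission impossible
    return prob

print(emission_prob("1*0*", (1, 0, 0, 1)))  # -> 0.25 (two free bits)
print(emission_prob("1*0*", (0, 0, 0, 1)))  # -> 0.0  (first bit mismatch)
```

A single pattern thus replaces the per-symbol emission table of a classical HMM, which is what keeps the learned models small even when the Boolean vectors are large.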