Model selection
Probabilistic similarity networks
Elements of information theory
Knowledge representation and inference in similarity networks and Bayesian multinets
Artificial Intelligence
Machine Learning - Special issue on learning with probabilistic representations
A Guide to the Literature on Learning Probabilistic Networks from Data
IEEE Transactions on Knowledge and Data Engineering
Speech recognition with dynamic Bayesian networks
Natural statistical models for automatic speech recognition
Buried Markov models for speech recognition
ICASSP '99: Proceedings of the 1999 IEEE International Conference on Acoustics, Speech, and Signal Processing - Volume 02
Discovering the hidden structure of complex dynamic systems
UAI'99 Proceedings of the Fifteenth conference on Uncertainty in artificial intelligence
Learning Bayesian network structure from massive datasets: the "sparse candidate" algorithm
UAI'99 Proceedings of the Fifteenth conference on Uncertainty in artificial intelligence
The Bayesian structural EM algorithm
UAI'98 Proceedings of the Fourteenth conference on Uncertainty in artificial intelligence
Learning the structure of dynamic probabilistic networks
UAI'98 Proceedings of the Fourteenth conference on Uncertainty in artificial intelligence
Learning Bayesian network classifiers by risk minimization
International Journal of Approximate Reasoning
Multistream recognition of dialogue acts in meetings
MLMI'06 Proceedings of the Third international conference on Machine Learning for Multimodal Interaction
In this work, dynamic Bayesian multinets are introduced, in which a Markov-chain state at time t determines the conditional-independence patterns among random variables within a local time window surrounding t. It is shown how information-theoretic criterion functions can be used to induce sparse, discriminative, and class-conditional network structures that yield an optimal approximation to the class posterior probability and are therefore useful for classification. Using a new structure-learning heuristic, the resulting models are tested on a medium-vocabulary isolated-word speech recognition task. It is demonstrated that these discriminatively structured dynamic Bayesian multinets, when trained in a maximum-likelihood setting using EM, can outperform both HMMs and other dynamic Bayesian networks with a similar number of parameters.
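The abstract's core idea of inducing class-conditional structure from an information-theoretic criterion can be illustrated with a minimal sketch. The code below is not the paper's actual algorithm; it is a hypothetical illustration in which, for the data of a single class, each candidate parent variable is scored by its empirical mutual information with a target variable and the top-k are kept. Running this per class yields one structure per class, which is the multinet idea; the function names `mutual_information` and `select_parents` are invented for this sketch.

```python
import numpy as np
from itertools import product

def mutual_information(x, y):
    """Empirical mutual information I(X; Y) between two discrete sequences."""
    x, y = np.asarray(x), np.asarray(y)
    mi = 0.0
    for xv, yv in product(np.unique(x), np.unique(y)):
        pxy = np.mean((x == xv) & (y == yv))  # joint probability estimate
        if pxy > 0:
            px = np.mean(x == xv)             # marginal of X
            py = np.mean(y == yv)             # marginal of Y
            mi += pxy * np.log(pxy / (px * py))
    return mi

def select_parents(data, target_idx, k):
    """Pick the k variables with the highest MI with the target variable.

    data: (n_samples, n_vars) array of discrete observations for ONE class,
    so repeating this per class gives class-conditional (multinet) structures.
    """
    scores = [(j, mutual_information(data[:, j], data[:, target_idx]))
              for j in range(data.shape[1]) if j != target_idx]
    scores.sort(key=lambda s: -s[1])
    return [j for j, _ in scores[:k]]
```

A discriminative variant, as the abstract suggests, would replace plain mutual information with a criterion that also penalizes dependencies shared across classes, keeping only edges that help separate them.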