Elements of information theory
Efficient Approximations for the Marginal Likelihood of Bayesian Networks with Hidden Variables
Machine Learning - Special issue on learning with probabilistic representations
Approximating posterior distributions in belief networks using mixtures
NIPS '97 Proceedings of the 1997 conference on Advances in neural information processing systems 10
Hierarchical Latent Class Models for Cluster Analysis
The Journal of Machine Learning Research
Naive Bayes models for probability estimation
ICML '05 Proceedings of the 22nd international conference on Machine learning
Mean field theory for sigmoid belief networks
Journal of Artificial Intelligence Research
Loopy belief propagation for approximate inference: an empirical study
UAI'99 Proceedings of the Fifteenth conference on Uncertainty in artificial intelligence
We propose a novel method for approximate inference in Bayesian networks (BNs). The idea is to sample data from a BN, learn a latent tree model (LTM) from the data offline, and, when online, perform inference with the LTM instead of the original BN. Because LTMs are tree-structured, inference takes linear time. At the same time, LTMs can represent complex relationships among leaf nodes, so the approximation accuracy is often good. Empirical evidence shows that our method achieves good approximation accuracy at low online computational cost.
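To make the linear-time online step concrete, here is a minimal illustrative sketch, not the authors' implementation: an assumed LTM with a single binary latent root H and three binary observed leaves (a naive-Bayes-structured tree), with made-up parameters. Answering a query marginalizes H with one factor per observed leaf, so the cost is linear in the number of leaves.

```python
import numpy as np

# Hypothetical LTM: one latent root H, three observed leaves X0..X2,
# all variables binary. Parameters below are illustrative assumptions.
p_h = np.array([0.6, 0.4])                      # P(H)
p_x_given_h = [np.array([[0.9, 0.1],            # P(X_i | H); rows: H=0, H=1
                         [0.2, 0.8]]) for _ in range(3)]

def posterior_h(evidence):
    """P(H | evidence), where evidence maps leaf index -> observed value.

    One multiplicative factor per observed leaf: linear in the evidence size.
    """
    log_post = np.log(p_h)
    for i, x in evidence.items():
        log_post += np.log(p_x_given_h[i][:, x])
    post = np.exp(log_post - log_post.max())    # stabilize before normalizing
    return post / post.sum()

def predict_leaf(j, evidence):
    """P(X_j | evidence) by summing out H: sum_h P(h | evidence) P(X_j | h)."""
    return posterior_h(evidence) @ p_x_given_h[j]

# Query the third leaf given the first two were observed to be 1.
print(predict_leaf(2, {0: 1, 1: 1}))
```

In the proposed method, the conditional tables above would instead be estimated offline (e.g. by EM) from samples drawn from the original BN, and the tree could have many latent nodes; the query-time marginalization remains tree message passing and stays linear in the model size.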