This paper proposes a new method, conditional probability table (CPT) decomposition, for analyzing the independent and deterministic components of a CPT. The method can be used both to approximate and to analyze Bayesian networks. A network is decomposed by representing each CPT as a linear combination of extreme CPTs, which yields a new framework for inference: the original network becomes a weighted mixture of less connected subnetworks. If the decomposition produces only singly connected subnetworks, inference is exact. Approximate inference is obtained either by discarding subnetworks with small weights, or by performing a partial decomposition and running belief propagation (BP) on the subnetworks that remain multiply connected. Experiments show that the decomposition-based approximation outperforms BP in most cases.
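To illustrate the core idea, the sketch below writes a small CPT as a convex combination of extreme (deterministic) CPTs and checks that inference distributes over the mixture. This is a minimal, hypothetical example, not the paper's decomposition algorithm: it enumerates every deterministic table (exponential in the number of parent states), whereas the paper is concerned with finding compact decompositions; the CPT values and prior are made up for illustration.

```python
import numpy as np
from itertools import product

# Hypothetical CPT P(X | Y): rows index the parent state y, columns the child state x.
cpt = np.array([[0.9, 0.1],
                [0.3, 0.7]])

def decompose_cpt(cpt):
    """Write a CPT as a convex combination of extreme (deterministic) CPTs.

    Each extreme CPT puts probability 1 on one child value per parent state;
    its weight is the product of the corresponding entries of the original CPT.
    This brute-force enumeration is for illustration only.
    """
    n_parent_states, n_child_states = cpt.shape
    terms = []
    for choice in product(range(n_child_states), repeat=n_parent_states):
        weight = np.prod([cpt[y, choice[y]] for y in range(n_parent_states)])
        if weight == 0:
            continue
        extreme = np.zeros_like(cpt)
        for y, x in enumerate(choice):
            extreme[y, x] = 1.0
        terms.append((weight, extreme))
    return terms

terms = decompose_cpt(cpt)

# The weighted extreme CPTs reconstruct the original table exactly,
# and the weights form a probability distribution over deterministic tables.
recon = sum(w * t for w, t in terms)
assert np.allclose(recon, cpt)
assert abs(sum(w for w, _ in terms) - 1.0) < 1e-12

# Inference distributes over the mixture: with a prior P(Y), the marginal
# P(X) equals the weighted sum of the marginals under each deterministic CPT.
prior = np.array([0.6, 0.4])
exact = prior @ cpt
mixed = sum(w * (prior @ t) for w, t in terms)
assert np.allclose(exact, mixed)
```

Because each extreme CPT is deterministic, the subnetwork it induces is simpler to reason about; the linearity checked above is what lets exact inference be recovered as a weighted sum over subnetworks, and approximation amounts to truncating small-weight terms of this sum.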