Lifted message passing approaches can be extremely fast at computing approximate marginal probability distributions over single variables and their neighbors in the underlying graphical model. They do not, however, prescribe a way to solve more complex inference tasks such as computing joint marginals for k-tuples of distant random variables or finding satisfying assignments of CNFs. A popular solution in these cases is to turn the complex inference task into a sequence of simpler ones by selecting and clamping variables one at a time and running lifted message passing again after each selection. This naive solution, however, recomputes the lifted network in each step from scratch, thereby often canceling the benefits of lifted inference. We show how to avoid this by efficiently computing the lifted network for each conditioning directly from the one already known for the single-node marginals. Our experiments show that significant efficiency gains are possible for lifted message-passing-guided decimation for SAT and sampling.
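The clamp-and-rerun loop described above can be sketched as follows. This is only an illustrative, hypothetical skeleton of decimation for SAT, with brute-force enumeration standing in for (lifted) message passing as the marginal estimator; it is not the authors' implementation, but it shows the select-clamp-recompute structure whose recomputation cost the paper addresses.

```python
# Hypothetical sketch of decimation for SAT (not the authors' code).
# Exact enumeration stands in here for the marginal estimates that lifted
# message passing would provide; the point is the clamp -> recompute loop.
from itertools import product

def satisfying_assignments(cnf, n_vars, fixed):
    """Yield assignments consistent with `fixed` that satisfy `cnf`.
    `cnf` is a list of clauses; a clause is a list of signed 1-based ints."""
    free = [v for v in range(1, n_vars + 1) if v not in fixed]
    for bits in product([False, True], repeat=len(free)):
        assign = dict(fixed)
        assign.update(zip(free, bits))
        if all(any(assign[abs(l)] == (l > 0) for l in clause) for clause in cnf):
            yield assign

def marginals(cnf, n_vars, fixed):
    """P(x_v = True) under the uniform distribution over satisfying assignments."""
    counts = {v: 0 for v in range(1, n_vars + 1)}
    total = 0
    for assign in satisfying_assignments(cnf, n_vars, fixed):
        total += 1
        for v, val in assign.items():
            counts[v] += val
    return {v: counts[v] / total for v in counts} if total else None

def decimate(cnf, n_vars):
    """Repeatedly clamp the most biased free variable and recompute marginals."""
    fixed = {}
    while len(fixed) < n_vars:
        probs = marginals(cnf, n_vars, fixed)
        if probs is None:
            return None  # no satisfying assignment remains
        # select the free variable whose marginal is furthest from 0.5
        v = max((u for u in probs if u not in fixed),
                key=lambda u: abs(probs[u] - 0.5))
        fixed[v] = probs[v] >= 0.5
    return fixed

if __name__ == "__main__":
    # (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
    cnf = [[1, 2], [-1, 3], [-2, -3]]
    print(decimate(cnf, 3))
```

In this naive form, the marginal computation is redone from scratch after every clamping step; the paper's contribution is to update the lifted network across these conditioning steps instead of reconstructing it each time.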