Probabilistic reasoning in intelligent systems: networks of plausible inference
Blocking Gibbs sampling in very large probabilistic expert systems
International Journal of Human-Computer Studies - Special issue: real-world applications of uncertain reasoning
Introduction to Monte Carlo methods
Proceedings of the NATO Advanced Study Institute on Learning in Graphical Models
Bucket elimination: a unifying framework for reasoning
Artificial Intelligence
Rao-Blackwellised Particle Filtering for Dynamic Bayesian Networks
UAI '00 Proceedings of the 16th Conference on Uncertainty in Artificial Intelligence
Cycle-cutset sampling for Bayesian networks
AI'03 Proceedings of the 16th Canadian Society for Computational Studies of Intelligence Conference on Advances in Artificial Intelligence
Random algorithms for the loop cutset problem
UAI'99 Proceedings of the Fifteenth Conference on Uncertainty in Artificial Intelligence
A general algorithm for approximate inference and its application to hybrid Bayes nets
UAI'99 Proceedings of the Fifteenth Conference on Uncertainty in Artificial Intelligence
HUGS: combining exact inference and Gibbs sampling in junction trees
UAI'95 Proceedings of the Eleventh Conference on Uncertainty in Artificial Intelligence
UAI '04 Proceedings of the 20th Conference on Uncertainty in Artificial Intelligence
An anytime scheme for bounding posterior beliefs
AAAI'06 Proceedings of the 21st National Conference on Artificial Intelligence - Volume 2
Approximate inference in probabilistic graphical models with determinism
AAAI'07 Proceedings of the 22nd National Conference on Artificial Intelligence - Volume 2
Cutset sampling for Bayesian networks
Journal of Artificial Intelligence Research
Active tuples-based scheme for bounding posterior beliefs
Journal of Artificial Intelligence Research
The paper studies empirically the time-space trade-off between sampling and inference in the cutset sampling algorithm. The algorithm samples over a subset of nodes in a Bayesian network and applies exact inference over the rest. As the size of the sampling space decreases, fewer samples are required for convergence, but the time for generating each sample increases. The w-cutset sampling algorithm selects a sampling set such that the induced width of the network, once the sampling set is observed, is bounded by w; the required inference is therefore at most exponential in w. In this paper, we investigate the performance of w-cutset sampling as a function of w. Our experiments over a range of randomly generated and real benchmarks demonstrate the power of the cutset sampling idea and, in particular, show that an optimal balance between inference and sampling benefits substantially from restricting the cutset size, even at the cost of more complex inference.
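The division of labor described in the abstract can be illustrated on a toy loopy network. The sketch below is an assumption-laden illustration, not the paper's implementation: the network (A -> B, A -> C, B -> D, C -> D), all CPT values, and the evidence D=1 are made up. Conditioning on the cutset {A} breaks the loop, so the remaining variables can be summed out exactly, and the estimate of P(B=1 | D=1) averages the exact conditional rather than raw sample counts (a Rao-Blackwellised estimator). With a single cutset variable the Gibbs step degenerates to i.i.d. sampling, which keeps the example short.

```python
import random

# Toy loopy Bayesian network: A -> B, A -> C, B -> D, C -> D (all binary).
# CPT values are illustrative assumptions only.
pA = {1: 0.6, 0: 0.4}
pB = {a: {1: p, 0: 1 - p} for a, p in [(1, 0.7), (0, 0.2)]}
pC = {a: {1: p, 0: 1 - p} for a, p in [(1, 0.4), (0, 0.8)]}
pD = {(b, c): {1: p, 0: 1 - p}
      for (b, c), p in [((1, 1), 0.9), ((1, 0), 0.5),
                        ((0, 1), 0.6), ((0, 0), 0.1)]}

def joint(a, b, c, d):
    return pA[a] * pB[a][b] * pC[a][c] * pD[(b, c)][d]

evidence_d = 1  # hypothetical evidence: D = 1

def exact_given_a(a):
    """Exact inference over {B, C} with the cutset variable A fixed:
    conditioned on A, the remaining network is a polytree, so the
    summation is cheap (here, brute-force over 4 configurations)."""
    z = sum(joint(a, b, c, evidence_d) for b in (0, 1) for c in (0, 1))
    p_b1 = sum(joint(a, 1, c, evidence_d) for c in (0, 1))
    return z, p_b1 / z

random.seed(0)
acc, n = 0.0, 5000
for _ in range(n):
    # Sampling step over the cutset {A}: draw A from P(A | D=1), with
    # the non-cutset variables summed out exactly.
    w = {av: exact_given_a(av)[0] for av in (0, 1)}
    a = 1 if random.random() < w[1] / (w[0] + w[1]) else 0
    # Rao-Blackwellised accumulation: add the exact P(B=1 | a, D=1)
    # instead of a 0/1 indicator.
    acc += exact_given_a(a)[1]

print("estimated P(B=1 | D=1):", acc / n)
```

Larger w-cutsets trade the other way: each sample requires exact inference over a subnetwork of induced width up to w, but far fewer samples are needed, which is precisely the balance the paper measures.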