Approximating probabilistic inference in Bayesian belief networks is NP-hard
Artificial Intelligence
Simulation and the Monte Carlo Method
Simulation Approaches to General Probabilistic Inference on Belief Networks
UAI '89 Proceedings of the Fifth Annual Conference on Uncertainty in Artificial Intelligence
Weighing and Integrating Evidence for Stochastic Simulation in Bayesian Networks
UAI '89 Proceedings of the Fifth Annual Conference on Uncertainty in Artificial Intelligence
Adaptive Importance Sampling for Estimation in Structured Domains
UAI '00 Proceedings of the 16th Conference on Uncertainty in Artificial Intelligence
Monte Carlo Strategies in Scientific Computing
AIS-BN: An adaptive importance sampling algorithm for evidential reasoning in large Bayesian networks
Journal of Artificial Intelligence Research
Some properties of joint probability distributions
UAI '94 Proceedings of the Tenth International Conference on Uncertainty in Artificial Intelligence
Backward simulation in Bayesian networks
UAI '94 Proceedings of the Tenth International Conference on Uncertainty in Artificial Intelligence
Importance sampling algorithms for Bayesian networks: Principles and performance
Mathematical and Computer Modelling: An International Journal
International Journal of Approximate Reasoning
Answering queries in hybrid Bayesian networks using importance sampling
Decision Support Systems
The AIS-BN algorithm [J. Cheng, M.J. Druzdzel, AIS-BN: An adaptive importance sampling algorithm for evidential reasoning in large Bayesian networks, Journal of Artificial Intelligence Research 13 (2000) 155-188] is a successful importance sampling-based algorithm for Bayesian networks that relies on two heuristic methods to obtain an initial importance function: ε-cutoff, which replaces small probabilities in the conditional probability tables by a larger ε, and setting the probability distributions of the parents of evidence nodes to uniform. However, why these simple heuristics are so effective was not well understood. In this paper, we point out that their effectiveness stems from a practical requirement on the importance function: a good importance function should possess thicker tails than the actual posterior probability distribution. By studying the basic assumptions behind importance sampling and the properties of importance sampling in Bayesian networks, we develop several theoretical insights into the desirability of thick tails for importance functions. These insights not only shed light on the success of the two heuristics of AIS-BN, but also provide a common theoretical basis for several other successful heuristic methods.
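The ε-cutoff idea described in the abstract can be sketched in a few lines. The following is a minimal illustration, not the exact AIS-BN implementation: the function name and the choice to remove the added mass proportionally from the larger entries are assumptions made here for concreteness. Raising near-zero probabilities to ε thickens the tails of the sampling distribution, which keeps the importance weights p(x)/q(x) from blowing up on rare states.

```python
def epsilon_cutoff(dist, eps=0.05):
    """Illustrative sketch of the epsilon-cutoff heuristic.

    Probabilities below `eps` are raised to `eps`; the mass added this
    way is then subtracted proportionally from the remaining (larger)
    entries so the result still sums to 1.  Sampling from the adjusted
    distribution instead of `dist` bounds the importance weight on the
    small-probability states by roughly dist[i] / eps.
    """
    small = [p < eps for p in dist]
    if all(small) or not any(small):
        # Nothing to cut off (or everything would be cut off): leave as-is.
        return list(dist)
    deficit = sum(eps - p for p, s in zip(dist, small) if s)
    big_total = sum(p for p, s in zip(dist, small) if not s)
    return [eps if s else p - deficit * p / big_total
            for p, s in zip(dist, small)]

# A CPT column with a near-deterministic entry:
# [0.01, 0.99] becomes [0.05, 0.95] with eps = 0.05.
adjusted = epsilon_cutoff([0.01, 0.99], eps=0.05)
```

Under the thin-tailed original distribution, a sample of the rare state would carry weight 0.01 / 0.01 only if it were ever drawn; with the adjusted distribution it is drawn five times as often and carries the modest, bounded weight 0.01 / 0.05 = 0.2, which is exactly the variance-reduction effect the thick-tails requirement predicts.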