In this paper we introduce a new dynamic importance sampling propagation algorithm for Bayesian networks. Importance sampling draws a set of configurations of the network's variables from an auxiliary sampling distribution, and the performance of the algorithm depends on the variance of the weights associated with the simulated configurations. The basic idea of dynamic importance sampling is to use each simulated configuration to modify the sampling distribution, improving its quality and thereby reducing the variance of future weights. The paper shows that this can be achieved at low computational cost. The experiments carried out show that the final results can be very good even when the initial sampling distribution is far from the optimal one.
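The idea can be illustrated with a minimal sketch. The toy two-node network (A → B, binary states), the numeric parameters, and the simple update rule below are all illustrative assumptions, not the networks or the probability-tree-based proposal update used in the paper: each drawn configuration receives a weight (target over proposal), and after every draw the proposal is nudged toward the running weighted estimate so that later weights have lower variance.

```python
import random

# Toy two-node network A -> B with binary states; illustrative only,
# not the networks or the proposal representation from the paper.
P_A1 = 0.3                        # P(A = 1)
P_B1_GIVEN_A = {0: 0.2, 1: 0.9}   # P(B = 1 | A)

def target(a):
    """Unnormalized target p(a) = P(A = a, B = 1) for evidence B = 1."""
    pa = P_A1 if a == 1 else 1.0 - P_A1
    return pa * P_B1_GIVEN_A[a]

def dynamic_importance_sampling(n, learn_rate=0.1, seed=0):
    """Self-normalized importance sampling with a crude dynamic step:
    after each sample, the proposal q = Q(A = 1) is pulled toward the
    current weighted estimate, shrinking the variance of future weights."""
    rng = random.Random(seed)
    q = 0.5                       # deliberately poor initial proposal
    num = den = 0.0
    for _ in range(n):
        a = 1 if rng.random() < q else 0
        qa = q if a == 1 else 1.0 - q
        w = target(a) / qa        # importance weight of this configuration
        num += w * a
        den += w
        est = num / den           # running estimate of P(A = 1 | B = 1)
        # Dynamic step (assumed update rule): blend proposal with estimate,
        # clamped away from 0 and 1 to keep the weights bounded.
        q = min(0.95, max(0.05, (1.0 - learn_rate) * q + learn_rate * est))
    return num / den

# Exact posterior: P(A=1 | B=1) = 0.27 / 0.41 ≈ 0.6585
print(dynamic_importance_sampling(20000))
```

Even though sampling starts from a uniform proposal far from the posterior, the adapted proposal quickly approaches P(A = 1 | B = 1), so late-stage weights are nearly constant; the self-normalized estimator stays consistent because each weight is computed with the proposal actually used for that draw.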