Importance sampling algorithms for Bayesian networks: Principles and performance

  • Authors:
  • Changhe Yuan; Marek J. Druzdzel

  • Affiliations:
  • Decision Systems Laboratory, Intelligent Systems Program, University of Pittsburgh, Pittsburgh, PA 15260, United States; Decision Systems Laboratory, School of Information Sciences and Intelligent Systems Program, University of Pittsburgh, Pittsburgh, PA 15260, United States

  • Venue:
  • Mathematical and Computer Modelling: An International Journal
  • Year:
  • 2006

Abstract

The precision achieved by stochastic sampling algorithms for Bayesian networks typically deteriorates in the face of extremely unlikely evidence. In addressing this problem, importance sampling algorithms seem to be most successful. We discuss the principles underlying the importance sampling algorithms in Bayesian networks. After that, we describe Evidence Pre-propagation Importance Sampling (EPIS-BN), an importance sampling algorithm that computes an importance function using two techniques: loopy belief propagation [K. Murphy, Y. Weiss, M. Jordan, Loopy belief propagation for approximate inference: An empirical study, in: Proceedings of the Fifteenth Annual Conference on Uncertainty in Artificial Intelligence (UAI-99), Morgan Kaufmann, San Francisco, CA, 1999, pp. 467-475; Y. Weiss, Correctness of local probability propagation in graphical models with loops, Neural Computation 12 (1) (2000) 1-41] and the ε-cutoff heuristic [J. Cheng, M.J. Druzdzel, AIS-BN: An adaptive importance sampling algorithm for evidential reasoning in large Bayesian networks, Journal of Artificial Intelligence Research 13 (2000) 155-188]. We tested the performance of EPIS-BN on three large real Bayesian networks and observed that on all three networks it outperforms AIS-BN [J. Cheng, M.J. Druzdzel, AIS-BN: An adaptive importance sampling algorithm for evidential reasoning in large Bayesian networks, Journal of Artificial Intelligence Research 13 (2000) 155-188], the current state-of-the-art algorithm, while avoiding its costly learning stage. We also compared EPIS-BN with Gibbs sampling and discuss the role of the ε-cutoff heuristic in importance sampling for Bayesian networks.
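
As a rough illustration of the ideas described above, the following is a minimal sketch of importance sampling with an ε-cutoff applied to the sampling distribution, on a toy two-node network (A → B). The network, the probability values, and the cutoff helper are invented for illustration and are not taken from the paper; EPIS-BN itself derives its importance function via loopy belief propagation, which this sketch does not implement.

```python
# Illustrative sketch: importance sampling with an epsilon-cutoff on a toy
# Bayesian network A -> B. All numbers and names are hypothetical.
import random

# CPTs: P(A=1) and P(B=1 | A)
p_a1 = 0.01
p_b1_given_a = {0: 0.2, 1: 0.9}

evidence_b = 1     # observed evidence: B = 1
epsilon = 0.05     # illustrative epsilon-cutoff threshold


def cutoff(p, eps=epsilon):
    """Keep sampling probabilities away from 0 and 1 (the epsilon-cutoff idea)."""
    return min(max(p, eps), 1.0 - eps)


def estimate_posterior_a1(n_samples=100_000, seed=0):
    """Estimate P(A=1 | B=1) by importance sampling.

    A is sampled from a cutoff-adjusted prior Q(A); each sample is weighted
    by P(A, B=evidence) / Q(A), and the weighted average of A is returned."""
    rng = random.Random(seed)
    q_a1 = cutoff(p_a1)                       # importance function for A
    num, den = 0.0, 0.0
    for _ in range(n_samples):
        a = 1 if rng.random() < q_a1 else 0
        q = q_a1 if a == 1 else 1.0 - q_a1    # sampling probability of this state
        p_joint = (p_a1 if a == 1 else 1.0 - p_a1) * p_b1_given_a[a]
        w = p_joint / q                       # importance weight
        num += w * a
        den += w
    return num / den


if __name__ == "__main__":
    # Exact answer: 0.01*0.9 / (0.01*0.9 + 0.99*0.2) ~= 0.0435
    print(f"estimated P(A=1 | B=1): {estimate_posterior_a1():.4f}")
```

Without the cutoff, A=1 would be sampled only about 1% of the time, so most samples would contribute nothing to the posterior of interest under this unlikely evidence; raising small sampling probabilities toward ε trades a modest increase in weight variance for far better coverage of the relevant states.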