New advances in inference by recursive conditioning

  • Authors:
  • David Allen, Adnan Darwiche

  • Affiliations:
  • Computer Science Department, University of California, Los Angeles, CA

  • Venue:
  • UAI'03: Proceedings of the Nineteenth Conference on Uncertainty in Artificial Intelligence
  • Year:
  • 2003

Abstract

Recursive Conditioning (RC) was introduced recently as an any-space algorithm for inference in Bayesian networks that can trade time for space by varying its cache size in increments as small as that needed to store a single floating-point number. Under full caching, RC has an asymptotic time and space complexity comparable to mainstream algorithms based on variable elimination and clustering (exponential in the network treewidth and linear in its size). We show two main results about RC in this paper. First, we show that its actual space requirements under full caching are much more modest than those needed by mainstream methods and study the implications of this finding. Second, we show that RC can effectively deal with determinism in Bayesian networks by employing standard logical techniques, such as unit resolution, allowing a significant reduction in its time requirements in certain cases. We illustrate our results using a number of benchmark networks, including the very challenging ones that arise in genetic linkage analysis.
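
To make the any-space caching idea concrete, below is a minimal, hypothetical sketch of recursive conditioning over a hand-built dtree for a three-variable chain network A -> B -> C. The class names, CPT parameters, and dtree construction here are illustrative assumptions, not the authors' implementation; a real system would build the dtree automatically, handle non-binary variables, and bound the cache to realize the time-space tradeoff described in the abstract.

```python
# Minimal sketch of recursive conditioning (RC) for computing the probability
# of evidence in a tiny Bayesian network. Cutsets and contexts are hard-coded
# for a hand-built dtree over the chain A -> B -> C (binary variables).
from itertools import product


class Leaf:
    """A dtree leaf holding one CPT: P(var | parents)."""
    def __init__(self, var, parents, table):
        self.var = var            # the CPT's child variable
        self.parents = parents    # tuple of parent variables
        self.table = table        # maps (var_value, *parent_values) -> probability

    def lookup(self, assignment):
        # If the child variable is uninstantiated, its CPT entries sum to 1.
        if self.var not in assignment:
            return 1.0
        key = (assignment[self.var],) + tuple(assignment[p] for p in self.parents)
        return self.table[key]


class Node:
    """Internal dtree node with a cutset (summed over) and a context (cache key)."""
    def __init__(self, left, right, cutset, context):
        self.left, self.right = left, right
        self.cutset, self.context = cutset, context
        self.cache = {}           # full caching; bounding this trades space for time


def rc(node, assignment):
    """Return the probability of the evidence in `assignment` under this subtree."""
    if isinstance(node, Leaf):
        return node.lookup(assignment)
    key = tuple(assignment[v] for v in node.context)
    if key in node.cache:
        return node.cache[key]
    total = 0.0
    for values in product([0, 1], repeat=len(node.cutset)):   # condition on the cutset
        extended = dict(assignment, **dict(zip(node.cutset, values)))
        total += rc(node.left, extended) * rc(node.right, extended)
    node.cache[key] = total
    return total


# Chain network A -> B -> C with hypothetical parameters.
leaf_a = Leaf('A', (), {(1,): 0.6, (0,): 0.4})
leaf_b = Leaf('B', ('A',), {(1, 1): 0.7, (0, 1): 0.3, (1, 0): 0.2, (0, 0): 0.8})
leaf_c = Leaf('C', ('B',), {(1, 1): 0.9, (0, 1): 0.1, (1, 0): 0.5, (0, 0): 0.5})

left = Node(leaf_a, leaf_b, cutset=('A',), context=('B',))
root = Node(left, leaf_c, cutset=('B',), context=())

print(rc(root, {'C': 1}))   # P(C = 1) = 0.7 for these parameters
```

In this sketch, disabling or capping `node.cache` at selected dtree nodes simply causes the corresponding subproblems to be recomputed, which is the space-for-time tradeoff the abstract refers to; under full caching, each context instantiation is computed at most once.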