Probabilistic reasoning in intelligent systems: networks of plausible inference.
Fusion and propagation with multiple observations in belief networks. Artificial Intelligence.
Local conditioning in Bayesian networks. Artificial Intelligence.
LAZY propagation: a junction tree inference algorithm based on lazy evaluation. Artificial Intelligence.
Probabilistic Expert Systems.
Introduction to Bayesian Networks.
Probabilistic Networks and Expert Systems.
Hybrid Propagation in Junction Trees. IPMU'94: Selected Papers from the 5th International Conference on Processing and Management of Uncertainty in Knowledge-Based Systems, Advances in Intelligent Computing.
Exploiting contextual independence in probabilistic inference. Journal of Artificial Intelligence Research.
A hybrid algorithm to compute marginal and joint beliefs in Bayesian networks and its complexity. UAI'98: Proceedings of the Fourteenth Conference on Uncertainty in Artificial Intelligence.
HUGS: combining exact inference and Gibbs sampling in junction trees. UAI'95: Proceedings of the Eleventh Conference on Uncertainty in Artificial Intelligence.
Query DAGs: a practical paradigm for implementing belief-network inference. UAI'96: Proceedings of the Twelfth International Conference on Uncertainty in Artificial Intelligence.
Bucket elimination: a unifying framework for probabilistic inference. UAI'96: Proceedings of the Twelfth International Conference on Uncertainty in Artificial Intelligence.
New advances in inference by recursive conditioning. UAI'03: Proceedings of the Nineteenth Conference on Uncertainty in Artificial Intelligence.
Efficient inference in large discrete domains. UAI'03: Proceedings of the Nineteenth Conference on Uncertainty in Artificial Intelligence.
Message-passing inference algorithms for Bayes nets fall into two broad classes: i) clustering algorithms, such as Lazy Propagation and the Jensen and Shafer-Shenoy schemes, which operate on secondary undirected trees; and ii) conditioning methods, such as Pearl's, which operate directly on the Bayes net. It is commonly believed that algorithms of the former class always outperform those of the latter, since Pearl-like methods act as special cases of clustering algorithms. This paper introduces a new variant of Pearl's method based on a secondary directed graph and shows that the computations performed by Shafer-Shenoy or Lazy Propagation can be reproduced exactly by this variant, thus proving that directed algorithms can be as efficient as undirected ones.
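To make the contrast between the two classes concrete, here is a minimal sketch on a toy chain network A -> B -> C with binary variables. The conditional probability tables are illustrative assumptions, not taken from the paper; the point is only that summing variables out along a secondary structure (the clustering style) and cutting on a variable then mixing the sub-results (the conditioning style) perform equivalent arithmetic and yield the same marginal.

```python
import numpy as np

# Toy chain Bayes net A -> B -> C, all binary.
# CPTs below are made-up numbers for illustration only.
p_a = np.array([0.6, 0.4])            # P(A)
p_b_given_a = np.array([[0.7, 0.3],   # P(B | A=0)
                        [0.2, 0.8]])  # P(B | A=1)
p_c_given_b = np.array([[0.9, 0.1],   # P(C | B=0)
                        [0.5, 0.5]])  # P(C | B=1)

# i) Clustering style: eliminate variables along the chain (sum-product),
# the same arithmetic a junction-tree method performs on its cliques.
msg_b = p_a @ p_b_given_a             # message carrying P(B)
p_c_cluster = msg_b @ p_c_given_b     # marginal P(C)

# ii) Conditioning style: instantiate A to each value, solve the simpler
# conditioned problem, then mix the answers weighted by P(A).
p_c_cond = sum(p_a[a] * (p_b_given_a[a] @ p_c_given_b) for a in range(2))

# Both strategies compute the same marginal P(C).
assert np.allclose(p_c_cluster, p_c_cond)
print(p_c_cluster)  # marginal distribution over C
```

With these made-up tables both routes give P(C) = [0.7, 0.3]; the paper's contribution concerns showing that such equivalences extend to general networks, where the relative cost of the two styles is the real question.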