A potential is a function that maps each configuration of a set of variables onto a real number. In the context of probabilistic graphical models, every family of probability distributions and every utility function is a potential, and the process of inference gives rise to new potentials. In principle, potentials defined on discrete variables might be represented as multidimensional arrays, but in practice they are implemented as linear arrays. In this paper we prove that in the case of large potentials, the cost of retrieving their elements is significantly higher than the cost of multiplying, maximizing, or summing them. For this reason, we present an alternative algorithm that sequentially retrieves the elements of a potential implemented as a linear array, without having to multiply the coordinates of each configuration by the offsets. We analyze theoretically and empirically the computational savings of this algorithm when applied to potential operations, such as marginalization, addition, multiplication, division, and conditioning. We also discuss the savings that can be obtained by multiplying several potentials at the same time, and by integrating the multiplication and marginalization of potentials.
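The contrast between offset-based and sequential retrieval can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the variable names, cardinalities, and the choice of row-major layout are assumptions made for the example. It marginalizes a small potential twice, first by computing each element's linear index from its coordinates and offsets, then by walking the array with a single running index, and checks that both traversals agree.

```python
# Hypothetical sketch: a potential phi(A, B, C) over discrete variables,
# stored as a flat (linear) array in row-major order.
cards = [2, 3, 2]  # assumed cardinalities of A, B, C

# Row-major offsets: offsets[i] is the product of the cardinalities
# of the variables that come after variable i.
offsets = [1] * len(cards)
for i in range(len(cards) - 2, -1, -1):
    offsets[i] = offsets[i + 1] * cards[i + 1]
# offsets == [6, 2, 1]

def index_of(config):
    """Naive access: multiply each coordinate by its offset and sum."""
    return sum(c * o for c, o in zip(config, offsets))

# Flat storage of the potential (dummy values for illustration).
phi = [float(i) for i in range(cards[0] * cards[1] * cards[2])]

# Marginalizing C out with naive indexing: one offset product per element.
marg_naive = [0.0] * (cards[0] * cards[1])
for a in range(cards[0]):
    for b in range(cards[1]):
        for c in range(cards[2]):
            marg_naive[a * cards[1] + b] += phi[index_of([a, b, c])]

# Sequential retrieval: in row-major order the last variable varies fastest,
# so a single running index visits every configuration in storage order,
# avoiding the coordinate-by-offset products entirely.
marg_seq = [0.0] * (cards[0] * cards[1])
idx = 0
for ab in range(cards[0] * cards[1]):
    for c in range(cards[2]):
        marg_seq[ab] += phi[idx]
        idx += 1

assert marg_naive == marg_seq
```

Both loops touch every element exactly once; the difference is that the sequential version replaces the per-element multiply-and-sum index computation with a single increment, which is the kind of saving the abstract quantifies for large potentials.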