Bayesian networks are efficient tools for probabilistic reasoning over large sets of variables, because the joint distribution factorises according to the structure of the network, which captures the conditional independence relations among the variables. Beyond conditional independence, the concept of asymmetric (or context-specific) independence enables even more efficient reasoning schemes, based on representing probability functions as probability trees. In this paper we investigate how a finer factorisation can be achieved by decomposing the original factors when certain conditions hold. We also introduce the concept of approximate factorisation and apply this methodology to the Lazy-Penniless propagation algorithm.
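To illustrate the idea of context-specific independence that probability trees exploit, here is a minimal sketch (the data structure and function names are assumptions for illustration, not the paper's implementation): a tree for P(X | A, B) in which the context A = 0 makes X independent of B, so that whole branch collapses to a single leaf instead of a full conditional probability table row per value of B.

```python
def prob_tree_lookup(tree, assignment):
    """Walk a probability tree down to the leaf selected by `assignment`.

    Internal nodes are dicts labelled with a variable; leaves are numbers.
    """
    while isinstance(tree, dict):
        var = tree["var"]
        tree = tree["children"][assignment[var]]
    return tree

# Leaves hold P(X = 1 | context). The A = 0 subtree is a single leaf,
# encoding the context-specific independence of X from B when A = 0.
tree = {
    "var": "A",
    "children": {
        0: 0.2,  # P(X=1 | A=0), for any value of B
        1: {"var": "B",
            "children": {0: 0.7, 1: 0.4}},  # P(X=1 | A=1, B=b)
    },
}

print(prob_tree_lookup(tree, {"A": 0, "B": 1}))  # 0.2 (B is irrelevant here)
print(prob_tree_lookup(tree, {"A": 1, "B": 1}))  # 0.4
```

A full table for P(X | A, B) would need four entries; the tree stores three, and the saving grows with the number of variables rendered irrelevant by a context, which is what makes tree-based factorisation attractive.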