MAP is the problem of finding a most probable instantiation of a set of variables in a Bayesian network, given (partial) evidence about the complement of that set. Unlike computing priors, posteriors, and MPE (a special case of MAP), the time and space complexity of MAP is exponential not only in the network treewidth but also in a larger parameter known as the "constrained" treewidth. In practice, this means that computing MAP can be orders of magnitude more expensive than computing priors, posteriors, or MPE. For this reason, MAP computations are generally avoided or approximated by practitioners. We have investigated the approximation of MAP using local search. The local search method has a space complexity that is exponential only in the network treewidth, as is the complexity of each step in the search process. Our experimental results show that local search provides a very good approximation of MAP while requiring a small number of search steps. Practically, this means that the average-case complexity of local search is often exponential only in the treewidth, as opposed to the constrained treewidth, making approximating MAP as efficient as these other computations.
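To make the idea concrete, here is a minimal sketch of MAP approximation by local search on a toy binary network. The network structure, CPT values, variable names, and the first-improvement flip strategy are all illustrative assumptions, not taken from the paper; a real implementation would score each candidate with jointree propagation (exponential only in treewidth) rather than brute-force marginalization.

```python
# Toy binary network A -> B, A -> C, B -> D with made-up CPTs.
P_A = {0: 0.6, 1: 0.4}                                         # P(A)
P_B = {(0, 0): 0.7, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.8}     # P(B | A)
P_C = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.5, (1, 1): 0.5}     # P(C | A)
P_D = {(0, 0): 0.8, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.7}     # P(D | B)

def joint(a, b, c, d):
    return P_A[a] * P_B[(a, b)] * P_C[(a, c)] * P_D[(b, d)]

def score(map_assign, evidence):
    """P(MAP vars = map_assign, evidence), marginalizing the free variable C.

    Brute force here for clarity; each such score is the 'one step' whose
    cost is exponential only in treewidth when done by jointree propagation.
    """
    a, b = map_assign
    d = evidence["D"]
    return sum(joint(a, b, c, d) for c in (0, 1))

def local_search_map(evidence, start=(0, 0)):
    """Hill-climb over instantiations of the MAP variables {A, B}:
    repeatedly flip one variable's value if doing so increases
    P(instantiation, evidence), until no flip improves the score."""
    current, best = start, score(start, evidence)
    improved = True
    while improved:
        improved = False
        for i in range(len(current)):
            neigh = list(current)
            neigh[i] = 1 - neigh[i]            # flip one MAP variable
            s = score(tuple(neigh), evidence)
            if s > best:
                current, best, improved = tuple(neigh), s, True
    return current, best

assignment, prob = local_search_map({"D": 1})
print(assignment, prob)
```

On this tiny example the search reaches the exact MAP instantiation in two flips; in general, local search offers no such guarantee and returns a local optimum whose quality the paper evaluates empirically.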