Probabilistic reasoning in intelligent systems: networks of plausible inference
Although probabilistic inference in a general Bayesian belief network is an NP-hard problem, computation time for inference can be reduced in most practical cases by exploiting domain knowledge and by making approximations in the knowledge representation. In this paper we introduce the property of similarity of states and a new method for approximate knowledge representation and inference based on this property. We define two or more states of a node to be similar when the ratio of their probabilities, the likelihood ratio, does not depend on the instantiations of the other nodes in the network. We show that similarity of states exposes redundancies in the joint probability distribution which can be exploited to reduce the computation time of probabilistic inference in networks with multiple similar states, and that the computational complexity in networks with exponentially many similar states can be polynomial. We demonstrate our ideas on the example of a BN2O network--a two-layer network often used in diagnostic problems--by reducing it to a closely approximating network with multiple similar states. We show that the answers to practical queries converge rapidly to the answers obtained with the original network. The maximum error is as low as 5% for models that require only 10% of the computation time needed by the original BN2O model.
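The definition of similar states can be sketched as a simple check on a node's conditional probability table: two states are similar when their likelihood ratio is the same for every instantiation of the conditioning variables. The function name, tolerance parameter, and example table below are illustrative, not from the paper.

```python
import numpy as np

def similar_states(cpt, i, j, tol=1e-9):
    """Return True if states i and j of a node are 'similar':
    the likelihood ratio P(state=i | pa) / P(state=j | pa) is
    the same for every parent instantiation pa (each row of cpt)."""
    ratios = cpt[:, i] / cpt[:, j]  # one ratio per parent instantiation
    return bool(np.allclose(ratios, ratios[0], atol=tol))

# Rows: parent instantiations; columns: states of the node.
# States 1 and 2 keep a constant 2:1 ratio, so they are similar
# and could be merged in an approximate representation.
cpt = np.array([
    [0.10, 0.60, 0.30],
    [0.40, 0.40, 0.20],
    [0.70, 0.20, 0.10],
])
print(similar_states(cpt, 1, 2))  # True
print(similar_states(cpt, 0, 1))  # False
```

When such a constant ratio holds, the similar states carry redundant information: storing one merged state plus the fixed ratio suffices, which is the redundancy the paper exploits to speed up inference.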