Probabilistic reasoning in intelligent systems: networks of plausible inference.
Computational intelligence: a logical approach.
Bucket elimination: a unifying framework for reasoning. Artificial Intelligence, special issue on computational tradeoffs under bounded resources.
Any-Space Probabilistic Inference. UAI '00: Proceedings of the 16th Conference on Uncertainty in Artificial Intelligence.
Query DAGs: a practical paradigm for implementing belief-network inference. UAI '96: Proceedings of the Twelfth Conference on Uncertainty in Artificial Intelligence.
Methods for constructing balanced elimination trees and other recursive decompositions. International Journal of Approximate Reasoning.
On the structure of elimination trees for Bayesian network inference. MICAI '10: Proceedings of the 9th Mexican International Conference on Artificial Intelligence, Advances in Soft Computing, Part II.
Exploiting dynamic independence in a static conditioning graph. AI '06: Proceedings of the 19th International Conference on Advances in Artificial Intelligence, Canadian Society for Computational Studies of Intelligence.
Efficient indexing methods for recursive decompositions of Bayesian networks. International Journal of Approximate Reasoning.
Applications that perform inference in Bayesian networks typically embed both the model and an inference engine. Sophisticated inference engines require non-trivial amounts of space and are difficult to implement, which limits the use of probabilistic inference in applications that would otherwise benefit from it. This paper presents a system that minimizes the space requirement of the model. The inference engine is simple enough to avoid these space limitations and to be implemented in almost any environment. We present a fast, compact indexing structure whose size is linear in the size of the network; the additional space required to compute over the model is linear in the number of variables in the network.
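To make the idea concrete, here is a minimal sketch (not the paper's actual data structures; all names are hypothetical) of the kind of scheme the abstract describes: CPTs stored as flat arrays indexed arithmetically via precomputed strides, queried by recursion over an elimination tree, so the only working storage beyond the tables is one assignment slot per variable.

```python
from typing import Dict, List


class Leaf:
    """A CPT stored as a flat list, indexed by strides (compact indexing)."""

    def __init__(self, table: List[float], vars_: List[str], card: Dict[str, int]):
        self.table = table
        self.vars = vars_
        # Row-major strides: index = sum(stride[v] * assignment[v]).
        self.strides: Dict[str, int] = {}
        s = 1
        for v in reversed(vars_):
            self.strides[v] = s
            s *= card[v]

    def value(self, assignment: Dict[str, int]) -> float:
        idx = sum(self.strides[v] * assignment[v] for v in self.vars)
        return self.table[idx]


class Sum:
    """Internal elimination-tree node: sums out one variable over its children."""

    def __init__(self, var: str, card: int, children):
        self.var, self.card, self.children = var, card, children

    def value(self, assignment: Dict[str, int]) -> float:
        total = 0.0
        for x in range(self.card):
            assignment[self.var] = x  # one slot per variable: linear extra space
            prod = 1.0
            for c in self.children:
                prod *= c.value(assignment)
            total += prod
        del assignment[self.var]
        return total


# Toy network A -> B, both binary (illustrative numbers only).
card = {"A": 2, "B": 2}
p_a = [0.6, 0.4]                  # P(A)
p_b = [0.7, 0.3, 0.2, 0.8]        # P(B | A), flat, index = 2*A + B
tree = Sum("A", 2, [Leaf(p_a, ["A"], card), Leaf(p_b, ["A", "B"], card)])

# P(B=1) = sum_a P(a) * P(B=1 | a)
p_b1 = tree.value({"B": 1})
```

The point of the sketch is the space accounting: the tables themselves are linear in the network's representation, the stride tables add a constant per table entry's variable, and a query mutates a single shared assignment dictionary rather than materializing intermediate factors.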