Recursive Conditioning (RC) is an any-space algorithm for exact inference in Bayesian networks, able to trade space for time in increments the size of a single floating-point number. This smooth tradeoff is achieved by varying the algorithm's cache size. When RC is run with a constrained cache, an important problem arises: which specific results should be cached in order to minimize the algorithm's running time? RC is driven by a structure known as a dtree, and many such dtrees exist for a given Bayesian network. In this paper, we examine the problem of searching for an optimal caching scheme for a given dtree, and present optimal time-space tradeoff curves for dtrees of several published Bayesian networks. We also compare these curves to the memory requirements of state-of-the-art algorithms based on join-trees. Our results show that the memory requirements for inference in these networks can be reduced significantly at only a minimal cost in time, enabling exact inference in situations where it was previously impractical. They also show that probabilistic reasoning systems can be designed to run efficiently under varying amounts of memory.
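To make the caching idea concrete, here is a minimal, illustrative sketch of recursive conditioning on a toy chain network A → B → C with a hand-built dtree: the root conditions on cutset {A}, an internal node conditions on cutset {B}, and the leaf holding the factor P(C|B) has context {B}, so its result can be cached per instantiation of B. The network, probabilities, and function names are invented for illustration and are not from the paper; the leaf-call counter simply exposes the recomputation that caching avoids.

```python
# Toy illustration (not the paper's implementation) of recursive
# conditioning with context caching on the chain A -> B -> C.
# All CPTs and names below are made up for the example.

P_A = {0: 0.6, 1: 0.4}                                   # P(A=a)
P_B_given_A = {(0, 0): 0.7, (0, 1): 0.3,                 # keyed (a, b)
               (1, 0): 0.2, (1, 1): 0.8}
P_C_given_B = {(0, 0): 0.9, (0, 1): 0.1,                 # keyed (b, c)
               (1, 0): 0.5, (1, 1): 0.5}

def query(c_value, use_cache):
    """Return (P(C=c_value), number of leaf evaluations for P(C|B))."""
    cache = {} if use_cache else None   # one float per cached context
    stats = {"leaf_calls": 0}

    def leaf_c(b):
        # Leaf holding factor P(C|B); its context is {B}, so results
        # may be cached keyed by the current instantiation of B.
        if cache is not None and b in cache:
            return cache[b]
        stats["leaf_calls"] += 1
        val = P_C_given_B[(b, c_value)]
        if cache is not None:
            cache[b] = val
        return val

    def t2(a):
        # Internal dtree node: condition on cutset {B} and sum out B.
        return sum(P_B_given_A[(a, b)] * leaf_c(b) for b in (0, 1))

    # Root: condition on cutset {A} and sum out A.
    total = sum(P_A[a] * t2(a) for a in (0, 1))
    return total, stats["leaf_calls"]
```

Without caching, the leaf for P(C|B) is evaluated once per joint instantiation of {A, B} (four times); with a two-float cache it is evaluated only once per instantiation of its context {B} (twice), while both runs return the same probability. This is the tradeoff in miniature: dropping cache entries costs recomputation time, never correctness.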