Many algorithms and applications involve repeatedly solving variations of the same inference problem, for example to introduce new evidence into the model or to change conditional dependencies. As the model is updated, the goal of adaptive inference is to reuse previously computed quantities so that inference can be performed more rapidly than from scratch. In this paper, we present algorithms for adaptive exact inference on general graphs that can be used to efficiently compute marginals and update MAP configurations under arbitrary changes to the input factor graph and its associated elimination tree. After a linear-time preprocessing step, our approach allows the model to be updated, and any marginal to be computed, in time logarithmic in the size of the input model. Moreover, in contrast to max-product, our approach can also update MAP configurations in time roughly proportional to the number of changed entries rather than the size of the input model. To evaluate the practical effectiveness of our algorithms, we implement and test them on synthetic data as well as in two real-world computational biology applications. Our experiments show that adaptive inference achieves substantial speedups over complete inference as the model undergoes small changes over time.
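To make the idea concrete, the following is a minimal sketch (not the paper's implementation; all names are our own) of adaptive inference on a chain-structured factor graph: sum-product messages are cached, and when one factor changes, only the messages downstream of it are recomputed. On a chain this path can be linear in the worst case; the paper's contribution is to maintain a balanced elimination tree so the recomputed path has logarithmic length.

```python
# Hypothetical sketch: cached sum-product on a chain of binary variables.
# Updating one unary factor recomputes only the messages it affects,
# and the resulting marginal matches a from-scratch computation.
import numpy as np

class AdaptiveChain:
    """Chain model with unary factors g_i(x_i) and pairwise f_i(x_i, x_{i+1})."""

    def __init__(self, unaries, pairwise):
        self.unaries = [np.asarray(u, float) for u in unaries]    # each shape (2,)
        self.pairwise = [np.asarray(p, float) for p in pairwise]  # each shape (2, 2)
        self.n = len(self.unaries)
        self.msgs = [None] * (self.n - 1)  # msgs[i]: message from x_i into x_{i+1}
        for i in range(self.n - 1):
            self._recompute(i)

    def _recompute(self, i):
        # m_i(x_{i+1}) = sum_{x_i} g_i(x_i) * m_{i-1}(x_i) * f_i(x_i, x_{i+1})
        inc = self.unaries[i] if i == 0 else self.unaries[i] * self.msgs[i - 1]
        self.msgs[i] = inc @ self.pairwise[i]

    def update_unary(self, k, new_unary):
        # Only messages downstream of x_k depend on g_k; with a balanced
        # elimination tree (as in the paper) this path has O(log n) length.
        self.unaries[k] = np.asarray(new_unary, float)
        for i in range(k, self.n - 1):
            self._recompute(i)
        return self.n - 1 - k  # number of messages recomputed

    def marginal_last(self):
        # Normalized marginal of the root variable x_{n-1}.
        b = self.unaries[-1] * self.msgs[-1]
        return b / b.sum()
```

The cache makes the update cost proportional to the number of invalidated messages rather than the model size, which is the essence of the adaptive approach; the balanced tree structure described in the paper is what turns this into a worst-case logarithmic bound.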