Knowledge compilation is a powerful approach to exact inference in probabilistic graphical models: it can effectively exploit determinism and context-specific independence, allowing it to scale to highly connected models that are infeasible for more traditional methods based on treewidth alone. Previous approaches proceed in two steps: encode a model into CNF, then compile the CNF into an equivalent but more tractable representation (d-DNNF), on which exact inference reduces to weighted model counting. In this paper, we investigate a bottom-up approach enabled by a recently proposed representation, the Sentential Decision Diagram (SDD). We describe a novel and efficient way to encode the factors of a given model directly as SDDs, bypassing the CNF representation. To compile a model, it then suffices to conjoin the SDD representations of its factors using an apply operator, which d-DNNFs lack. Empirically, we find that this simpler approach to knowledge compilation is as effective as those based on d-DNNFs, and at times orders of magnitude faster.
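To make the weighted-model-counting reduction mentioned above concrete, here is a minimal, illustrative sketch (not the paper's method): a brute-force weighted model counter over a small CNF. The function name `wmc`, the clause/weight encoding, and the example numbers are all hypothetical choices for this sketch; a real compiler would instead build an SDD or d-DNNF so the count can be read off the circuit without enumerating models.

```python
# Illustrative only: brute-force weighted model counting (WMC) of a CNF.
# Real knowledge compilers avoid this exponential enumeration by compiling
# the formula into a tractable circuit (SDD / d-DNNF) first.
from itertools import product

def wmc(n_vars, clauses, weights):
    """Weighted model count of a CNF by enumeration.

    clauses: list of clauses; each clause is a list of nonzero ints,
             where +v is the positive literal of var v and -v the negative.
    weights: dict mapping every literal (+v and -v) to its weight.
    """
    total = 0.0
    for bits in product([False, True], repeat=n_vars):
        # A model must satisfy at least one literal in every clause.
        if all(any(bits[abs(l) - 1] == (l > 0) for l in c) for c in clauses):
            w = 1.0
            for v in range(1, n_vars + 1):
                w *= weights[v if bits[v - 1] else -v]
            total += w
    return total

# Example (arbitrary weights): count models of (x1 OR x2).
weights = {1: 0.6, -1: 0.4, 2: 0.3, -2: 0.7}
wmc(2, [[1, 2]], weights)  # ≈ 0.72, i.e. 1 - 0.4 * 0.7
```

The three satisfying assignments contribute 0.18 + 0.42 + 0.12 = 0.72; a compiled circuit would yield the same value in time linear in the circuit's size.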