Probabilistic logical languages provide powerful formalisms for knowledge representation and learning. Yet performing inference in these languages is extremely costly, especially if it is done at the propositional level. Lifted inference algorithms, which avoid repeated computation by treating indistinguishable groups of objects as one, help mitigate this cost. Seeking inspiration from logical inference, where lifted inference (e.g., resolution) is commonly performed, we develop a model-theoretic approach to probabilistic lifted inference. Our algorithm compiles a first-order probabilistic theory into a first-order deterministic decomposable negation normal form (d-DNNF) circuit. Compilation offers the advantage that inference is polynomial in the size of the circuit. Furthermore, by borrowing techniques from the knowledge compilation literature, our algorithm effectively exploits the logical structure (e.g., context-specific independencies) within the first-order model, which allows more computation to be done at the lifted level. An empirical comparison demonstrates the utility of the proposed approach.
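To make the compilation advantage concrete, the following is a minimal sketch (not the paper's compiler) of why inference is polynomial in circuit size: once a theory is in smooth d-DNNF, weighted model counting reduces to a single bottom-up pass, with products at decomposable AND nodes and sums at deterministic OR nodes. The circuit encoding and literal weights below are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch: weighted model counting (WMC) on a hand-built
# smooth d-DNNF circuit. Evaluation is one bottom-up pass, hence linear
# in the size of the compiled circuit.

def wmc(node, weights):
    """Evaluate a d-DNNF node bottom-up.

    A node is ("lit", name), ("and", children), or ("or", children),
    where a literal name like "A" or "~A" indexes into `weights`.
    """
    kind = node[0]
    if kind == "lit":
        return weights[node[1]]
    child_counts = [wmc(child, weights) for child in node[1]]
    if kind == "and":          # decomposable: children share no variables
        result = 1.0
        for c in child_counts:
            result *= c
        return result
    if kind == "or":           # deterministic: children mutually exclusive
        return sum(child_counts)
    raise ValueError(f"unknown node kind: {kind}")

# Smooth d-DNNF for the clause (A or B): (A and (B or ~B)) or (~A and B).
circuit = ("or", [
    ("and", [("lit", "A"), ("or", [("lit", "B"), ("lit", "~B")])]),
    ("and", [("lit", "~A"), ("lit", "B")]),
])

# Literal weights encoding P(A)=0.3, P(B)=0.5 for independent A and B.
weights = {"A": 0.3, "~A": 0.7, "B": 0.5, "~B": 0.5}

print(wmc(circuit, weights))  # P(A or B) = 0.3 + 0.7*0.5 = 0.65
```

The lifted setting described in the abstract generalizes this idea: a *first-order* d-DNNF additionally contains nodes that raise a child count to the power of a domain size, so groups of indistinguishable objects are handled in one exponentiation rather than one evaluation per object.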