Efficient probabilistic inference is key to the success of statistical relational learning. One factor that inflates the cost of inference is the presence of irrelevant random variables. The Bayes-ball algorithm identifies the requisite variables in a propositional Bayesian network, allowing irrelevant variables to be ignored. This paper presents a lifted version of Bayes-ball that works directly at the first-order level, and shows how the algorithm applies to (lifted) inference in directed first-order probabilistic models.
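To make the propositional starting point concrete, the following is a minimal sketch of the classic (propositional, not lifted) Bayes-ball algorithm in the style of Shachter's formulation. The graph representation (dicts of parent and child lists) and the function name are illustrative choices, not taken from the paper; the returned set contains the requisite probability nodes, i.e. the nodes whose conditional distributions are needed to answer the query given the evidence.

```python
def bayes_ball(parents, children, query, evidence):
    """Return the requisite probability nodes for P(query | evidence).

    parents, children: dicts mapping node -> list of parent/child nodes.
    query, evidence: sets of node names.
    """
    # Schedule of visits: (node, came_from_child). The query nodes are
    # visited as if the ball arrived from a child.
    to_visit = [(q, True) for q in query]
    top = set()     # nodes that have passed the ball up to their parents
    bottom = set()  # nodes that have passed the ball down to their children

    while to_visit:
        node, from_child = to_visit.pop()
        if from_child:
            # Ball arriving from a child is blocked at an observed node;
            # an unobserved node passes it to both parents and children.
            if node not in evidence:
                if node not in top:
                    top.add(node)
                    to_visit.extend((p, True) for p in parents[node])
                if node not in bottom:
                    bottom.add(node)
                    to_visit.extend((c, False) for c in children[node])
        else:
            # Ball arriving from a parent bounces back off an observed
            # node (to its parents); an unobserved node passes it on
            # down to its children.
            if node in evidence:
                if node not in top:
                    top.add(node)
                    to_visit.extend((p, True) for p in parents[node])
            elif node not in bottom:
                bottom.add(node)
                to_visit.extend((c, False) for c in children[node])

    # Nodes marked "on top" are those whose CPDs the query requires.
    return top
```

For the collider A -> C <- B with C observed, querying A makes all three nodes requisite (explaining away), whereas in the chain A -> B -> C with B observed, querying C leaves only C requisite; A is irrelevant and can be pruned before inference.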