Statistical Relational Learning has received much attention over the last decade. In the ILP community, several models have emerged for representing and learning uncertain knowledge expressed in subsets of first-order logic. Nevertheless, few deep comparisons have been made among them, and, given an application, determining which model to choose is difficult. In this paper, we compare two of them, namely Markov Logic Networks and Bayesian Programs, especially with respect to their representation ability and inference methods. The comparison shows that the two models are substantially different from the user's point of view, so that choosing one really means choosing a different philosophy for looking at the problem. To make the comparison more concrete, we use a running example that illustrates most of the interesting points of the two approaches while remaining exactly tractable.