Recently, there has been increasing interest in directed probabilistic logical models, and a variety of languages for describing such models have been proposed. Although many authors give high-level arguments that models in their language can in principle be learned from data, most of the proposed learning algorithms have not yet been studied in detail. We introduce generalized ordering-search, an algorithm that learns both the structure and the conditional probability distributions (CPDs) of directed probabilistic logical models. The algorithm upgrades the ordering-search algorithm for Bayesian networks to the relational setting. We use relational probability trees to represent the CPDs. We report experiments on blocks world domains, a gene domain, and the Cora dataset.
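To make the starting point concrete, the propositional ordering-search idea that the paper upgrades can be sketched as follows: score a variable ordering by giving each variable its best-scoring parent set drawn from its predecessors, and hill-climb over orderings by swapping adjacent variables. This is a minimal illustrative sketch for binary data with a BIC-style score, not the paper's generalized relational algorithm; all function names and the scoring details are assumptions made for the example.

```python
import itertools
import math
import random

def family_score(data, child, parents):
    # BIC-style score of one (child, parents) family on binary data:
    # log-likelihood of the child given each parent configuration,
    # minus a penalty of 0.5 * (#parameters) * log(#samples).
    n = len(data)
    counts = {}
    for row in data:
        key = tuple(row[p] for p in parents)
        counts.setdefault(key, [0, 0])[row[child]] += 1
    ll = 0.0
    for c0, c1 in counts.values():
        tot = c0 + c1
        for c in (c0, c1):
            if c:
                ll += c * math.log(c / tot)
    n_params = 2 ** len(parents)  # one Bernoulli parameter per parent configuration
    return ll - 0.5 * n_params * math.log(n)

def best_parents(data, child, candidates, max_parents=2):
    # Exhaustively pick the best parent subset (up to max_parents) from candidates.
    best = (family_score(data, child, ()), ())
    for k in range(1, max_parents + 1):
        for ps in itertools.combinations(candidates, k):
            s = family_score(data, child, ps)
            if s > best[0]:
                best = (s, ps)
    return best

def ordering_score(data, order, max_parents=2):
    # Score an ordering: each variable may only take parents among its predecessors.
    total, structure = 0.0, {}
    for i, v in enumerate(order):
        s, ps = best_parents(data, v, order[:i], max_parents)
        total += s
        structure[v] = ps
    return total, structure

def ordering_search(data, n_vars, max_parents=2, restarts=3, seed=0):
    # Local search over orderings: hill-climb by swapping adjacent variables.
    rng = random.Random(seed)
    best_total, best_struct = -math.inf, None
    for _ in range(restarts):
        order = list(range(n_vars))
        rng.shuffle(order)
        total, struct = ordering_score(data, order, max_parents)
        improved = True
        while improved:
            improved = False
            for i in range(n_vars - 1):
                cand = order[:]
                cand[i], cand[i + 1] = cand[i + 1], cand[i]
                t, s = ordering_score(data, cand, max_parents)
                if t > total:
                    order, total, struct, improved = cand, t, s, True
        if total > best_total:
            best_total, best_struct = total, struct
    return best_struct, best_total

if __name__ == "__main__":
    # Toy data: X1 copies X0 with 10% noise, X2 is independent.
    rng = random.Random(1)
    data = []
    for _ in range(500):
        x0 = rng.randint(0, 1)
        x1 = x0 if rng.random() < 0.9 else 1 - x0
        x2 = rng.randint(0, 1)
        data.append((x0, x1, x2))
    struct, score = ordering_search(data, 3)
    print(struct)
```

The generalized version described in the abstract replaces the propositional per-family score and parent sets with relational probability trees learned as the CPDs.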