Generalized Ordering-Search for Learning Directed Probabilistic Logical Models
Inductive Logic Programming
Recently, there has been increasing interest in directed probabilistic logical models, and a variety of formalisms for describing such models have been proposed. Although many authors give high-level arguments that models in their formalism can in principle be learned from data, most of the proposed learning algorithms have not yet been studied in detail. We introduce an algorithm, generalized ordering-search, that learns both the structure and the conditional probability distributions (CPDs) of directed probabilistic logical models. The algorithm is based on the ordering-search algorithm for Bayesian networks and uses relational probability trees to represent the CPDs. We present experiments on a genetics domain, blocks world domains, and the Cora dataset.
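The ordering-search idea the abstract builds on can be sketched as follows: for a fixed total ordering of the variables, the optimal network respecting that ordering can be found independently per variable (each variable picks its best parent set among its predecessors), so the search happens over orderings rather than over graph structures. This is a minimal, hypothetical Python sketch, not the paper's algorithm: the function and variable names are invented, and `local_score` is a placeholder for a real decomposable score such as BIC or a relational-probability-tree fit.

```python
import itertools

def local_score(var, parents, data):
    """Placeholder scorer: a real implementation would compute a
    decomposable score (e.g. BIC, or the quality of a relational
    probability tree for `var` given `parents`) from the data."""
    return -len(parents)  # stand-in: prefers smaller parent sets

def best_parents(var, predecessors, data, max_parents=2):
    """For a fixed ordering, the best parent set for `var` is chosen
    freely among its predecessors -- this is what makes ordering-search
    cheaper than searching over graphs directly."""
    best, best_score = frozenset(), local_score(var, (), data)
    for k in range(1, max_parents + 1):
        for cand in itertools.combinations(predecessors, k):
            s = local_score(var, cand, data)
            if s > best_score:
                best, best_score = frozenset(cand), s
    return best, best_score

def ordering_search(variables, data, max_parents=2):
    """Greedy hill-climbing over orderings: repeatedly try swapping
    adjacent variables and keep any swap that improves the total score."""
    order = list(variables)

    def total(ordering):
        return sum(best_parents(v, ordering[:i], data, max_parents)[1]
                   for i, v in enumerate(ordering))

    score = total(order)
    improved = True
    while improved:
        improved = False
        for i in range(len(order) - 1):
            cand = order[:]
            cand[i], cand[i + 1] = cand[i + 1], cand[i]
            s = total(cand)
            if s > score:
                order, score, improved = cand, s, True
    return order, {v: best_parents(v, order[:i], data, max_parents)[0]
                   for i, v in enumerate(order)}
```

With the placeholder scorer the search trivially keeps every parent set empty; the point of the sketch is the two-level structure (inner per-variable parent selection, outer swap-based search over orderings), which is the part the paper generalizes to the relational setting.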