Learning directed relational models with recursive dependencies

  • Authors:
  • Oliver Schulte, Hassan Khosravi, Tong Man

  • Affiliations:
  • School of Computing Science, Simon Fraser University, Vancouver-Burnaby, B.C., Canada (all authors)

  • Venue:
  • ILP'11 Proceedings of the 21st international conference on Inductive Logic Programming
  • Year:
  • 2011

Abstract

Recently, there has been increasing interest in generative relational models that represent probabilistic patterns over both links and attributes. A key characteristic of relational data is that the value of a predicate often depends on values of the same predicate for related entities. In this paper we present a new approach to learning directed relational models which utilizes two key concepts: a pseudo likelihood measure that is well defined for recursive dependencies, and the notion of stratification from logic programming. An issue in modelling recursive dependencies with Bayes nets is redundant edges that increase the complexity of learning. We propose a new normal form for first-order Bayes nets that removes the redundancy, and prove that, assuming stratification, the normal form constraints involve no loss of modelling power. We incorporate these constraints in the learn-and-join algorithm of Khosravi et al., a state-of-the-art structure learning algorithm that upgrades propositional Bayes net learners for relational data. Empirical evaluation compares our approach with learning recursive dependencies using undirected models (Markov Logic Networks). The Bayes net approach is orders of magnitude faster and learns more recursive dependencies, which lead to more accurate predictions.
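
As a rough illustration of what a recursive dependency and a grounding-based pseudo likelihood look like, the following Python sketch evaluates a toy rule of the form Smokes(X) depends on Smokes(Y) for friends Y over a small database. It is not the paper's algorithm or its pseudo likelihood measure; the predicates, conditional probabilities, and individuals are invented for illustration only.

```python
# Toy illustration (not the paper's method): a recursive dependency where
# Smokes(X) depends on Smokes(Y) for friends Y, scored with a simple
# grounding-based pseudo-log-likelihood. All facts and numbers are made up.
import math

# Ground facts for Friend(X, Y) and Smokes(X).
friends = {("anna", "bob"), ("bob", "anna"), ("bob", "carl"), ("carl", "bob")}
smokes = {"anna": True, "bob": True, "carl": False}
people = sorted(smokes)

# Hypothetical conditional probabilities for the recursive dependency
# Smokes(X) <- Smokes(Y), Friend(X, Y).
P_SMOKES_IF_FRIEND_SMOKES = 0.7
P_SMOKES_IF_NO_FRIEND_SMOKES = 0.2

def friend_smokes(x):
    """True if some friend of x smokes (the recursive parent of Smokes(x))."""
    return any(smokes[y] for (a, y) in friends if a == x)

def pseudo_log_likelihood():
    """Sum of log P(Smokes(x) | parent state) over all groundings x."""
    total = 0.0
    for x in people:
        p_true = (P_SMOKES_IF_FRIEND_SMOKES if friend_smokes(x)
                  else P_SMOKES_IF_NO_FRIEND_SMOKES)
        p = p_true if smokes[x] else 1.0 - p_true
        total += math.log(p)
    return total

print(f"pseudo log-likelihood of the toy database: {pseudo_log_likelihood():.3f}")
```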