On the problem of reversing relational inductive knowledge representation

  • Authors:
  • Nico Potyka, Christoph Beierle, Gabriele Kern-Isberner

  • Affiliations:
  • Dept. of Computer Science, FernUniversität in Hagen, Hagen, Germany (Nico Potyka, Christoph Beierle); Dept. of Computer Science, TU Dortmund, Dortmund, Germany (Gabriele Kern-Isberner)

  • Venue:
  • ECSQARU'13: Proceedings of the 12th European Conference on Symbolic and Quantitative Approaches to Reasoning with Uncertainty
  • Year:
  • 2013

Abstract

By using the principle of maximum entropy, incomplete probabilistic knowledge can be completed to a full joint distribution. This inductive knowledge representation method can be reversed to extract probabilistic rules from an empirical probability distribution. Based on this idea, a propositional learning approach has been developed. Recently, an extension to a relational language has been presented, where, however, a central aspect, namely finding and resolving the algebraic equations needed for the solution, has been treated as a black box. Here, we investigate both problems in more detail. We explain how equations for relational knowledge bases can be resolved, and give a comprehensive example of computing a relational knowledge base from a probability distribution. Furthermore, we describe how propositional mechanisms for finding equations can be refined to focus on more interesting equations and to reduce the number of candidates.
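The first step mentioned in the abstract, completing incomplete probabilistic knowledge to a full joint distribution under the principle of maximum entropy, can be illustrated on a tiny propositional example. The sketch below is not the paper's relational method; it assumes a hypothetical setting with two atoms a, b and one conditional rule (b|a)[0.8], solves the maximum-entropy completion as a constrained optimization with SciPy, and then reads the rule's probability back off the resulting distribution to hint at the reverse direction the paper addresses.

```python
import numpy as np
from scipy.optimize import minimize

# Possible worlds over two propositional atoms a and b (truth values 0/1).
worlds = [(0, 0), (0, 1), (1, 0), (1, 1)]

# Hypothetical conditional rule (b|a)[0.8], read as P(a and b) = 0.8 * P(a).
RULE_PROB = 0.8

def rule_residual(p):
    # Linear constraint induced by the rule; zero when the rule is satisfied.
    p_ab = sum(pi for pi, (a, b) in zip(p, worlds) if a and b)
    p_a = sum(pi for pi, (a, b) in zip(p, worlds) if a)
    return p_ab - RULE_PROB * p_a

def neg_entropy(p):
    # Negative Shannon entropy; the epsilon guards against log(0).
    return float(np.sum(p * np.log(p + 1e-12)))

constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},  # normalization
    {"type": "eq", "fun": rule_residual},               # rule constraint
]
p0 = np.full(len(worlds), 1.0 / len(worlds))            # start from uniform
res = minimize(neg_entropy, p0, bounds=[(0, 1)] * len(worlds),
               constraints=constraints)
p_me = res.x

# Reverse direction: read a candidate rule's probability back off the
# completed distribution, here P(b | a) recovered from the ME solution.
p_a = sum(pi for pi, (a, b) in zip(p_me, worlds) if a)
p_ab = sum(pi for pi, (a, b) in zip(p_me, worlds) if a and b)
print("ME distribution:", dict(zip(worlds, np.round(p_me, 4))))
print("recovered P(b|a):", round(p_ab / p_a, 4))
```

In the relational setting studied in the paper, the same idea applies to distributions over interpretations of a relational language, where groups of worlds behave symmetrically; the algebraic equations discussed in the abstract arise from exploiting these symmetries, which the toy example above does not model.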