Extending Markov Logic to Model Probability Distributions in Relational Domains

  • Authors:
  • Dominik Jain; Bernhard Kirchlechner; Michael Beetz

  • Affiliations:
  • Intelligent Autonomous Systems Group, Department of Informatics, Technische Universität München (all authors)

  • Venue:
  • KI '07: Proceedings of the 30th Annual German Conference on Advances in Artificial Intelligence
  • Year:
  • 2007

Abstract

Markov logic is a highly expressive representation formalism that essentially combines the semantics of probabilistic graphical models with the full power of first-order logic, making it one of the most intriguing representations in the field of probabilistic logical modelling. However, as we show, models in Markov logic often fail to generalize because the parameters they contain are highly domain-specific. We take the perspective of generative stochastic processes in order to describe probability distributions in relational domains, and we illustrate the problem in this context by means of simple examples. We propose an extension of the language that involves the specification of a priori independent attributes and that furthermore introduces a dynamic parameter adjustment whenever a model in Markov logic is instantiated for a given domain (set of objects). Our extension removes the corresponding restrictions on the processes for which models can be learned using standard methods and thus enables Markov logic networks to be applied in practice to a far greater class of generative stochastic processes.
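
To make the domain-specificity of Markov logic parameters concrete, the following is a minimal sketch (not taken from the paper) of standard Markov logic semantics: a toy network with a single made-up formula A(x) ∧ R(x,y) carrying a fixed weight w, where the probability of a possible world is proportional to exp(w · n), with n the number of satisfied groundings. Computing the exact marginal of one ground atom by brute-force enumeration shows that it drifts as the domain grows, which is the kind of behaviour a dynamic parameter adjustment at instantiation time would have to counteract. The predicates, formula, weight, and function name are illustrative assumptions.

```python
import itertools
import math


def mln_marginal_A(domain_size, w):
    """Exact marginal P(A(o0)) in a toy MLN with one formula A(x) ^ R(x,y), weight w.

    Ground atoms: A(x) for each object x, and R(x,y) for each ordered pair (x,y).
    A world's weight is exp(w * number of satisfied formula groundings);
    probabilities are obtained by normalizing over all possible worlds.
    """
    objs = range(domain_size)
    a_atoms = [("A", x) for x in objs]
    r_atoms = [("R", x, y) for x in objs for y in objs]
    atoms = a_atoms + r_atoms

    z = 0.0        # partition function
    p_a0 = 0.0     # unnormalized probability mass of worlds where A(o0) holds
    for world in itertools.product([False, True], repeat=len(atoms)):
        truth = dict(zip(atoms, world))
        # count satisfied groundings of A(x) ^ R(x,y)
        n_sat = sum(1 for x in objs for y in objs
                    if truth[("A", x)] and truth[("R", x, y)])
        weight = math.exp(w * n_sat)
        z += weight
        if truth[("A", 0)]:
            p_a0 += weight
    return p_a0 / z


# With a fixed weight, the marginal of A(o0) shifts as the domain grows,
# because the number of groundings coupling A(o0) to R-atoms grows with it.
for n in (1, 2, 3):
    print(f"domain size {n}: P(A(o0)) = {mln_marginal_A(n, w=1.0):.4f}")
```

Running the sketch shows the marginal of A(o0) increasing monotonically with domain size even though the weight is unchanged, which illustrates why weights learned on one domain size need not carry over to another.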