Exploiting qualitative knowledge in the learning of conditional probabilities of Bayesian networks

  • Authors:
  • Frank Wittig; Anthony Jameson

  • Affiliations:
  • Department of Computer Science, University of Saarbrücken, Saarbrücken, Germany; Department of Computer Science, University of Saarbrücken, Saarbrücken, Germany

  • Venue:
  • UAI'00: Proceedings of the Sixteenth Conference on Uncertainty in Artificial Intelligence
  • Year:
  • 2000

Abstract

Algorithms for learning the conditional probabilities of Bayesian networks with hidden variables typically operate within a high-dimensional search space and yield only locally optimal solutions. One way of limiting the search space and avoiding local optima is to impose qualitative constraints that are based on background knowledge concerning the domain. We present a method for integrating formal statements of qualitative constraints into two learning algorithms, APN and EM. In our experiments with synthetic data, this method yielded networks that satisfied the constraints almost perfectly. The accuracy of the learned networks was consistently superior to that of corresponding networks learned without constraints. The exploitation of qualitative constraints therefore appears to be a promising way to increase both the interpretability and the accuracy of learned Bayesian networks with known structure.
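To make the idea of a qualitative constraint concrete, the sketch below shows one simple way such background knowledge can be enforced on learned conditional probabilities: a binary parent X is known to positively influence a binary child Y, so the learned CPT must satisfy P(Y=1 | X=1) ≥ P(Y=1 | X=0). This is an illustrative assumption-laden example, not the authors' APN/EM integration; the pooling projection used to repair violations, the add-one smoothing, and all function names are choices made for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground truth for a two-node network X -> Y (both binary).
# The true CPT respects the positive influence: P(Y=1|X=1) > P(Y=1|X=0).
p_x1 = 0.3
p_y1_given_x = np.array([0.2, 0.7])  # [P(Y=1|X=0), P(Y=1|X=1)]

# Draw a small sample; with little data, the unconstrained estimate
# can easily violate the qualitative constraint.
n = 30
x = (rng.random(n) < p_x1).astype(int)
y = (rng.random(n) < p_y1_given_x[x]).astype(int)

def ml_estimate(x, y):
    """Estimate P(Y=1 | X=v) for v in {0, 1} with add-one smoothing."""
    est = np.empty(2)
    for v in (0, 1):
        mask = (x == v)
        est[v] = (y[mask].sum() + 1) / (mask.sum() + 2)
    return est

def project_positive_influence(est, counts):
    """If the estimate violates P(Y=1|X=1) >= P(Y=1|X=0), pool the two
    conditional estimates (weighted by their sample counts) -- a simple
    pool-adjacent-violators style repair step."""
    if est[1] < est[0]:
        pooled = np.average(est, weights=counts)
        est = np.array([pooled, pooled])
    return est

est = ml_estimate(x, y)
counts = np.array([(x == 0).sum(), (x == 1).sum()])
constrained = project_positive_influence(est, counts)
print("unconstrained estimate:", est)
print("constrained estimate:  ", constrained)
```

In a learning setting with hidden variables, a repair or penalty step of this kind would be applied to the relevant CPT columns after each parameter update, so that the final network both fits the data and respects the stated qualitative influences.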