On Compatible Priors for Bayesian Networks

  • Authors:
  • Robert G. Cowell

  • Venue:
  • IEEE Transactions on Pattern Analysis and Machine Intelligence
  • Year:
  • 1996

Abstract

Given a Bayesian network of discrete random variables with a hyper-Dirichlet prior, a method is proposed for assigning Dirichlet priors to the conditional probabilities of structurally different networks. The method defines a distance measure between priors, which is minimized in the assignment process. Intuitively, if two models' priors are to qualify as 'close' in some sense, then their posteriors should also be close after an observation; however, one does not know in advance what will be observed next. This leads us to define the distance between priors as an expectation of Kullback-Leibler divergences taken over all possible next observations. In conjunction with the additional assumptions of global and local independence of the parameters [15], a number of properties that are usually taken as reasonable assumptions in the Bayesian network literature emerge instead as theorems. The method is compared to the 'expansion and contraction' algorithm of [14], and is contrasted with the results of [7], which employ the additional assumption of likelihood equivalence, not made here. A simple example illustrates the technique.
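
To make the proposed distance concrete, here is a minimal Python sketch for the simplest case: a single discrete variable with Dirichlet priors. It computes an expectation of Kullback-Leibler divergences between posteriors over all possible next observations, using the closed-form KL divergence between Dirichlet distributions. The function names are hypothetical, and weighting by the predictive distribution of the reference prior is an assumption of this sketch; the paper's actual construction applies this idea across the conditional probability tables of two network structures.

```python
import numpy as np
from scipy.special import gammaln, digamma

def dirichlet_kl(a, b):
    """Closed-form KL divergence KL(Dir(a) || Dir(b))."""
    a0, b0 = a.sum(), b.sum()
    return (gammaln(a0) - gammaln(b0)
            + (gammaln(b) - gammaln(a)).sum()
            + ((a - b) * (digamma(a) - digamma(a0))).sum())

def expected_posterior_kl(a, b):
    """Expected KL between posteriors over all possible next observations.

    The next observation is assumed drawn from the predictive distribution
    of the reference prior Dir(a); observing outcome k updates each prior
    by adding the one-hot count e_k to its hyperparameters.
    """
    a0 = a.sum()
    total = 0.0
    for k in range(len(a)):
        e_k = np.eye(len(a))[k]   # one-hot count for outcome k
        p_k = a[k] / a0           # predictive probability of outcome k
        total += p_k * dirichlet_kl(a + e_k, b + e_k)
    return total

# Example: two priors over a 3-state variable.
a = np.array([2.0, 2.0, 2.0])   # reference prior hyperparameters
b = np.array([1.0, 2.0, 3.0])   # candidate prior for the new structure
print(expected_posterior_kl(a, b))
```

In this simplified setting, minimizing expected_posterior_kl over the hyperparameters b would correspond to the assignment step the abstract describes: choosing the Dirichlet prior on the new structure whose posteriors stay closest, on average, to those of the original prior.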