Dependent Dirichlet priors and optimal linear estimators for belief net parameters

  • Authors:
  • Peter M. Hooper

  • Affiliations:
  • University of Alberta, Edmonton, Canada

  • Venue:
  • UAI '04 Proceedings of the 20th conference on Uncertainty in artificial intelligence
  • Year:
  • 2004

Abstract

A Bayesian belief network is a model of a joint distribution over a finite set of variables, with a DAG structure representing immediate dependencies among the variables. For each node, a table of parameters (CP-table) represents local conditional probabilities, with rows indexed by conditioning events (assignments to parents). CP-table rows are usually modeled as independent random vectors, each assigned a Dirichlet prior distribution. The assumption that rows are independent permits a relatively simple analysis but may not reflect actual prior opinion about the parameters. Rows representing similar conditioning events often have similar conditional probabilities. This paper introduces a more flexible family of "dependent Dirichlet" prior distributions, where rows are not necessarily independent. Simple methods are developed to approximate the Bayes estimators of CP-table parameters with optimal linear estimators; i.e., linear combinations of sample proportions and prior means. This approach yields more efficient estimators by sharing information among rows. Improvements in efficiency can be substantial when a CP-table has many rows and sample sizes are small.
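
The estimators described in the abstract take the form of linear combinations of sample proportions and prior means. As a point of reference, the sketch below shows this form for the familiar baseline case of a single CP-table row under an ordinary (independent) Dirichlet prior, where the posterior mean is exactly such a convex combination. The function name and parameters are illustrative only; the paper's dependent-Dirichlet estimators additionally share information across rows, which this baseline sketch does not attempt.

```python
import numpy as np

def linear_estimate(counts, prior_mean, prior_strength):
    """Posterior mean for one CP-table row under an independent Dirichlet prior.

    With a Dirichlet(prior_strength * prior_mean) prior, the Bayes estimate is
    a convex (linear) combination of the sample proportions and the prior mean:

        theta_hat = w * sample_proportion + (1 - w) * prior_mean,
        w = n / (n + prior_strength),

    where n is the number of observations for this conditioning event.
    """
    counts = np.asarray(counts, dtype=float)
    prior_mean = np.asarray(prior_mean, dtype=float)
    n = counts.sum()
    if n == 0:
        return prior_mean  # no data for this row: fall back to the prior mean
    sample_prop = counts / n
    w = n / (n + prior_strength)
    return w * sample_prop + (1.0 - w) * prior_mean

# Example: a row over 3 states with few observations (values are illustrative).
counts = [2, 1, 0]             # observed counts for one parent assignment
prior_mean = [1/3, 1/3, 1/3]   # prior mean for the row
print(linear_estimate(counts, prior_mean, prior_strength=4.0))
```

With small n, the weight w is small and the estimate stays close to the prior mean; as n grows, it approaches the sample proportions. The paper's contribution lies in choosing such linear weights optimally while pooling evidence across rows with similar conditioning events.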