A Bayesian belief network is a model of a joint distribution over a finite set of variables, with a DAG structure representing immediate dependencies among the variables. For each node, a table of parameters (CP-table) represents local conditional probabilities, with rows indexed by conditioning events (assignments to the node's parents). CP-table rows are usually modeled as independent random vectors, each assigned a Dirichlet prior distribution. The assumption that rows are independent permits a relatively simple analysis but may not reflect actual prior opinion about the parameters: rows representing similar conditioning events often have similar conditional probabilities. This paper introduces a more flexible family of "dependent Dirichlet" prior distributions, in which rows are not necessarily independent. Simple methods are developed to approximate the Bayes estimators of CP-table parameters with optimal linear estimators, i.e., linear combinations of sample proportions and prior means. This approach yields more efficient estimators by sharing information among rows. Improvements in efficiency can be substantial when a CP-table has many rows and sample sizes are small.
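To make the linear-estimator idea concrete, the sketch below shows two estimators for CP-table rows. The first is the standard posterior mean under an independent Dirichlet prior, which is exactly a linear combination of the row's sample proportion and its prior mean. The second illustrates information sharing among rows by shrinking each row toward the pooled proportion across all rows; this is a hypothetical shrinkage scheme for illustration only, not the paper's dependent-Dirichlet construction, and the `shrink` parameter is an assumed tuning constant.

```python
def dirichlet_posterior_mean(counts, alpha):
    """Bayes estimate of one CP-table row under an independent Dirichlet
    prior: a linear combination of the sample proportion and the prior
    mean, with weight n / (n + alpha_0) on the data."""
    n, a0 = sum(counts), sum(alpha)
    if n == 0:
        return [a / a0 for a in alpha]          # no data: return prior mean
    w = n / (n + a0)                            # weight on the observed data
    return [w * c / n + (1 - w) * a / a0 for c, a in zip(counts, alpha)]


def pooled_linear_estimate(rows, shrink=5.0):
    """Illustrative sharing of information among rows (NOT the paper's
    exact estimator): shrink each row's sample proportion toward the
    pooled proportion over all rows, with assumed strength `shrink`.
    Sparsely observed rows borrow more from the pooled estimate."""
    pooled = [sum(col) for col in zip(*rows)]
    total = sum(pooled)
    pooled_p = [p / total for p in pooled]      # proportions over all rows
    estimates = []
    for counts in rows:
        n = sum(counts)
        w = n / (n + shrink)                    # data weight for this row
        p_hat = [c / n for c in counts] if n > 0 else pooled_p
        estimates.append([w * ph + (1 - w) * pp
                          for ph, pp in zip(p_hat, pooled_p)])
    return estimates
```

With counts `[3, 1]` and a uniform prior `alpha = [1, 1]`, `dirichlet_posterior_mean` returns `[2/3, 1/3]`, matching the usual posterior mean `(counts + alpha) / (n + alpha_0)`. In `pooled_linear_estimate`, a row with few observations is pulled strongly toward the pooled proportions, which is the sense in which rows share information.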