In many domains, we are interested in analyzing the structure of the underlying distribution, e.g., whether one variable is a direct parent of another. Bayesian model selection attempts to find the MAP model and use its structure to answer these questions. However, when the amount of available data is modest, there may be many models with non-negligible posterior. Thus, we want to compute the Bayesian posterior of a feature, i.e., the total posterior probability of all models that contain it. In this paper, we propose a new approach for this task. We first show how to efficiently compute a sum over the exponential number of networks that are consistent with a fixed ordering over the network variables. This allows us to compute, for a given ordering, both the marginal probability of the data and the posterior of a feature. We then use this result as the basis for an algorithm that approximates the Bayesian posterior of a feature. Our approach uses a Markov chain Monte Carlo (MCMC) method, but over orderings rather than over network structures. The space of orderings is much smaller and more regular than the space of structures, and has a smoother posterior "landscape". We present empirical results on synthetic and real-life datasets that compare our approach to full model averaging (when possible), to MCMC over network structures, and to a non-Bayesian bootstrap approach.
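The order-MCMC idea described above can be sketched in a few lines. This is a minimal toy illustration, not the paper's implementation: `toy_log_score` is a made-up stand-in for the closed-form sum over networks consistent with an ordering, and the proposal move (swapping adjacent variables) is one of several moves one could use.

```python
import math
import random

def order_mcmc(log_score, n_vars, n_steps, rng):
    """Metropolis-Hastings over variable orderings.

    log_score(order) should return the log marginal likelihood of the data
    summed over all networks consistent with `order`; here it is a toy
    stand-in for the closed-form per-ordering sum described in the paper.
    """
    order = list(range(n_vars))
    current = log_score(order)
    samples = []
    for _ in range(n_steps):
        # Propose swapping two adjacent variables in the ordering.
        i = rng.randrange(n_vars - 1)
        proposal = order[:]
        proposal[i], proposal[i + 1] = proposal[i + 1], proposal[i]
        cand = log_score(proposal)
        # Standard Metropolis accept/reject step in log space.
        if math.log(rng.random()) < cand - current:
            order, current = proposal, cand
        samples.append(tuple(order))
    return samples

# Hypothetical score that favors orderings where variable 0 precedes variable 2.
def toy_log_score(order):
    return 2.0 if order.index(0) < order.index(2) else 0.0

rng = random.Random(0)
samples = order_mcmc(toy_log_score, n_vars=3, n_steps=2000, rng=rng)
# Estimate the posterior of the "feature" that 0 precedes 2 by averaging
# over the sampled orderings.
p = sum(s.index(0) < s.index(2) for s in samples) / len(samples)
```

Because the score puts weight `exp(2)` on orderings with 0 before 2 and weight `exp(0)` on the rest, the estimate `p` should settle near `exp(2) / (exp(2) + 1)`; a real application would replace the toy score with the per-ordering marginal likelihood and average a structural feature's posterior the same way.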