Papers tackling the task of selecting a Bayesian network structure from data often state that there are two distinct approaches: (i) applying conditional independence tests to decide whether each edge is present; and (ii) searching the model space using a scoring metric. Here I argue that, for complete data and a given node ordering, this division is largely a myth, by showing that cross-entropy methods for testing conditional independence are mathematically identical to methods that discriminate between models by their overall goodness-of-fit logarithmic scores.
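The equivalence claimed above can be illustrated in the simplest (unconditional) case: for a candidate edge X → Y, the gap in maximised log-likelihood between the model with the edge and the model without it equals N times the empirical mutual information between X and Y, which is exactly the cross-entropy statistic a conditional independence test would use. The sketch below is my own illustration of this identity, not code from the paper; all function names are hypothetical.

```python
# Sketch: log-score gap for an edge X -> Y equals N * empirical MI(X; Y).
# Illustrative only; function names are not from the paper.
from collections import Counter
from math import log

def log_lik_no_edge(pairs):
    """Maximised log-likelihood of Y with no parent: sum_y n_y * log(n_y / N)."""
    n = len(pairs)
    counts_y = Counter(y for _, y in pairs)
    return sum(c * log(c / n) for c in counts_y.values())

def log_lik_with_edge(pairs):
    """Maximised log-likelihood of Y with parent X: sum_{x,y} n_xy * log(n_xy / n_x)."""
    counts_x = Counter(x for x, _ in pairs)
    counts_xy = Counter(pairs)
    return sum(c * log(c / counts_x[x]) for (x, _), c in counts_xy.items())

def mutual_info(pairs):
    """Empirical mutual information between X and Y (in nats)."""
    n = len(pairs)
    counts_x = Counter(x for x, _ in pairs)
    counts_y = Counter(y for _, y in pairs)
    counts_xy = Counter(pairs)
    return sum((c / n) * log(c * n / (counts_x[x] * counts_y[y]))
               for (x, y), c in counts_xy.items())

data = [(0, 0), (0, 0), (0, 1), (1, 1), (1, 1), (1, 0), (0, 0), (1, 1)]
score_gap = log_lik_with_edge(data) - log_lik_no_edge(data)
# score_gap coincides with len(data) * mutual_info(data) up to floating point.
```

The conditional case (testing X ⊥ Y given a parent set Z, for a fixed node ordering) works the same way, with conditional mutual information replacing MI and the sums taken within each configuration of Z.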