Efficient Approximations for the Marginal Likelihood of Bayesian Networks with Hidden Variables. Machine Learning (special issue on learning with probabilistic representations).
Asymptotic Model Selection for Naive Bayesian Networks. Journal of Machine Learning Research.
Algebraic Geometry and Statistical Learning Theory.
A widely applicable Bayesian information criterion. Journal of Machine Learning Research.
The standard Bayesian Information Criterion (BIC) is derived under regularity conditions that are not always satisfied by graphical models with hidden variables. In this paper we derive the BIC for binary graphical tree models in which every inner node of the tree represents a binary hidden variable. This extends a similar formula given by Rusakov and Geiger for naive Bayes models. The main tool used in this paper is the connection between the growth behavior of marginal likelihood integrals and the real log-canonical threshold.
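The connection invoked here can be stated explicitly. The following is a sketch of the standard asymptotic expansions from Watanabe's singular learning theory, not a formula taken from this abstract: in regular models the penalty coefficient is half the parameter count, while in singular models it is the real log-canonical threshold.

```latex
% Regular case (standard BIC): n i.i.d. observations X, a model with
% d free parameters, maximum-likelihood estimate \hat{\theta}:
\log \int p(X \mid \theta)\,\pi(\theta)\,d\theta
  \;=\; \log p(X \mid \hat{\theta}) \;-\; \frac{d}{2}\log n \;+\; O_p(1).

% Singular case (Watanabe): d/2 is replaced by the real log-canonical
% threshold \lambda of the model at the true distribution, with
% multiplicity m of that pole:
\log \int p(X \mid \theta)\,\pi(\theta)\,d\theta
  \;=\; \log p(X \mid \hat{\theta}) \;-\; \lambda \log n
        \;+\; (m-1)\log\log n \;+\; O_p(1).
```

For hidden-variable tree models the Fisher information can be degenerate, so \(\lambda \le d/2\) in general; computing \(\lambda\) for these models is what yields the corrected BIC.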