Bayesian classification (AutoClass): theory and results
Advances in knowledge discovery and data mining
Efficient Approximations for the Marginal Likelihood of Bayesian Networks with Hidden Variables
Machine Learning - Special issue on learning with probabilistic representations
Automated resolution of singularities for hypersurfaces
Journal of Symbolic Computation - Special issue on applications of the Gröbner basis method
Algebraic Analysis for Nonidentifiable Learning Machines
Neural Computation
Asymptotic model selection for naive Bayesian networks
UAI'02 Proceedings of the Eighteenth Conference on Uncertainty in Artificial Intelligence
On the geometry of Bayesian graphical models with hidden variables
UAI'98 Proceedings of the Fourteenth Conference on Uncertainty in Artificial Intelligence
Asymptotic model selection for directed networks with hidden variables
UAI'96 Proceedings of the Twelfth International Conference on Uncertainty in Artificial Intelligence
Algebraic statistics in model selection
UAI'04 Proceedings of the Twentieth Conference on Uncertainty in Artificial Intelligence
Effective dimensions of hierarchical latent class models
Journal of Artificial Intelligence Research
Effective dimensions of partially observed polytrees
International Journal of Approximate Reasoning
We present two algorithms for the analytic asymptotic evaluation of the marginal likelihood of data given a Bayesian network with hidden nodes. As shown by previous work, this evaluation is particularly hard because for these models the asymptotic approximation of the marginal likelihood deviates from the standard BIC score. Our algorithms compute the regular dimensionality drop for latent models and derive the non-standard approximation formulas required for singular statistical models. The algorithms are implemented in Matlab and Maple, and their usage is demonstrated on several examples.
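To make the contrast concrete, the following is a minimal illustrative sketch (not the paper's actual algorithms) of how the standard BIC penalty differs from the non-standard asymptotic form used for singular models, where the effective coefficient λ (the learning coefficient) and its multiplicity m replace the d/2 dimension term. The numeric values below are hypothetical placeholders, not results from the paper.

```python
import math

def bic_score(loglik, dim, n):
    # Standard BIC approximation: log L - (d/2) * log N,
    # valid when the model is regular (Fisher information non-singular).
    return loglik - 0.5 * dim * math.log(n)

def singular_score(loglik, lam, mult, n):
    # Non-standard asymptotic form for singular models:
    # log L - lambda * log N + (m - 1) * log log N,
    # with learning coefficient lambda and multiplicity m.
    # For hidden-variable networks, lambda is typically below d/2,
    # so the penalty is milder than BIC's.
    return loglik - lam * math.log(n) + (mult - 1) * math.log(math.log(n))

# Hypothetical comparison for n = 1000 samples, d = 10 parameters:
n = 1000
standard = bic_score(-1500.0, 10, n)        # penalty 5 * log(1000)
singular = singular_score(-1500.0, 3.5, 1, n)  # penalty 3.5 * log(1000)
```

With λ = 3.5 < d/2 = 5, the singular score penalizes less than BIC, which is the sense in which the correct asymptotics "deviate from the standard BIC score."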