In this paper we demonstrate how Gröbner bases and other algebraic techniques can be used to explore the geometry of the probability space of Bayesian networks with hidden variables. These techniques employ a parametrisation of Bayesian networks by moments rather than by conditional probabilities. We show that whilst Gröbner bases help to explain the local geometry of these spaces, a complementary analysis, modelling the positivity of probabilities, enhances and completes the geometrical picture. We report some recent geometrical results in this area and discuss a possible general methodology for the analysis of such problems.
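The following is a minimal, hypothetical sketch (not the paper's own computation) of the kind of analysis the abstract describes: parametrise observable moments by hidden-variable parameters, then use a lexicographic Gröbner basis to eliminate those parameters and recover implicit polynomial constraints on the moments. The star-tree model, the symbols `a_i` and `l_ij`, and the use of sympy are all assumptions made for illustration.

```python
# Hypothetical illustration: implicitising a moment parametrisation with a
# Groebner basis, using sympy.
#
# Consider four observed variables that are conditionally independent given a
# single hidden variable (a "star" tree).  In a moment parametrisation the
# pairwise covariances factorise as l_ij = a_i * a_j, where a_i is a scaled
# covariance between leaf i and the hidden variable.  Eliminating the a_i
# recovers the classical "tetrad" constraints on the observable moments.
from sympy import symbols, groebner

a1, a2, a3, a4 = symbols('a1 a2 a3 a4')
l12, l13, l14, l23, l24, l34 = symbols('l12 l13 l14 l23 l24 l34')

# The parametrisation, written as generators l_ij - a_i*a_j of an ideal.
gens = [l12 - a1*a2, l13 - a1*a3, l14 - a1*a4,
        l23 - a2*a3, l24 - a2*a4, l34 - a3*a4]

# Lex order with the hidden parameters a_i listed first eliminates them.
G = groebner(gens, a1, a2, a3, a4, l12, l13, l14, l23, l24, l34, order='lex')

# Basis elements free of a1..a4 generate the elimination ideal, i.e. the
# implicit equations satisfied by the observable moments.
implicit = [g for g in G.exprs if not g.has(a1, a2, a3, a4)]
print(implicit)

# The tetrad constraints lie in the ideal:
assert G.contains(l12*l34 - l13*l24)
assert G.contains(l12*l34 - l14*l23)

# Polynomial equations alone do not capture the model: for instance
# l12*l13*l23 = (a1*a2*a3)**2 >= 0 is a positivity (semialgebraic)
# condition invisible to the Groebner basis, which is the kind of
# complementary analysis the abstract refers to.
```

The lexicographic term order is what makes this an elimination: any basis element containing none of the leading block of variables belongs to the elimination ideal, so the implicit description of the model drops out of the basis directly.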