Variational methods, which have become popular in the neural computing and machine learning literature, are applied to the Bayesian analysis of mixtures of Gaussian distributions. It is also shown how the deviance information criterion (DIC) can be extended to this type of model by exploiting variational approximations. The use of variational methods for model selection and the calculation of a DIC are illustrated with real and simulated data. The variational approach allows simultaneous estimation of the component parameters and the model complexity. It is found that initially selecting a large number of components results in superfluous components being eliminated as the method converges to a solution, which corresponds to an automatic choice of model complexity. The appropriateness of this choice is reflected in the DIC values.
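The component-elimination behaviour described above can be illustrated with a short sketch. This is not the paper's own code: it uses scikit-learn's `BayesianGaussianMixture`, a variational inference implementation of the same general kind, and the data, prior value, and weight threshold below are illustrative assumptions. Starting the fit with many more components than the data support, the variational updates drive the mixing weights of superfluous components toward zero.

```python
# Illustrative sketch (assumed setup, not the paper's algorithm):
# variational Bayes for a Gaussian mixture, initialized with deliberately
# many components so that superfluous ones are emptied during fitting.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Simulated data: three well-separated 1-D Gaussian clusters.
data = np.concatenate([
    rng.normal(-5.0, 1.0, 300),
    rng.normal(0.0, 1.0, 300),
    rng.normal(6.0, 1.0, 300),
]).reshape(-1, 1)

# Deliberately over-specify the number of components; a small
# weight-concentration prior encourages unused components to empty out.
model = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior=1e-3,
    max_iter=500,
    random_state=0,
).fit(data)

# Count components whose posterior mixing weight is non-negligible
# (0.01 is an arbitrary illustrative threshold).
effective = int(np.sum(model.weights_ > 0.01))
print(effective)  # typically far fewer than 10, close to the true 3
```

The point of the sketch is the automatic choice of complexity: the practitioner supplies only an upper bound on the number of components, and the variational posterior concentrates mass on the ones the data actually support.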