In Bayesian analysis of a statistical model, the predictive distribution is obtained by marginalizing over the parameters with their posterior distributions. Compared to the commonly used point-estimate plug-in method, the predictive distribution yields a more reliable estimate of the predictive likelihood of new data, especially when the amount of training data is small. The Bayesian estimation of a Dirichlet mixture model (DMM) is, in general, not analytically tractable. In our previous work, we proposed a global variational inference-based method for approximately calculating the posterior distributions of the parameters in the DMM analytically. In this paper, we extend that study and propose an algorithm to calculate the predictive distribution of the DMM with the local variational inference (LVI) method. The true predictive distribution of the DMM is analytically intractable. By exploiting the concavity of the multivariate inverse beta function, we introduce an upper bound to the true predictive distribution. Since the global minimum of this upper bound exists, the problem reduces to seeking an approximation to the true predictive distribution. The approximated predictive distribution obtained by minimizing the upper bound is analytically tractable, facilitating the computation of the predictive likelihood. Evaluations on synthesized and real data demonstrate the good performance of the proposed LVI-based method in comparison with several conventionally used methods.
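The two ideas the abstract rests on can be sketched in a much simpler setting than the DMM. The following is a minimal, illustrative example (not the paper's algorithm): a conjugate Beta-Bernoulli model, where marginalizing over the parameter is tractable in closed form, contrasts the plug-in and fully Bayesian predictive likelihoods; and the local variational trick of upper-bounding a concave function by one of its tangents is shown for `log(x)`. All function names here are illustrative assumptions.

```python
import math

# Conjugate Beta-Bernoulli illustration (a toy stand-in for the DMM):
# the posterior predictive marginalizes over the parameter exactly,
# while the plug-in method substitutes a point estimate (here the MLE).

def predictive_bayes(heads, tails, a=1.0, b=1.0):
    """P(next = 1 | data) under a Beta(a, b) prior, marginalized in closed form."""
    return (a + heads) / (a + b + heads + tails)

def predictive_plugin(heads, tails):
    """P(next = 1 | data) using the MLE point estimate of the parameter."""
    return heads / (heads + tails)

# With very little training data the plug-in estimate is extreme,
# while the Bayesian predictive stays hedged toward the prior.
print(predictive_plugin(2, 0))  # 1.0  -- overconfident
print(predictive_bayes(2, 0))   # 0.75 -- regularized by marginalization

# Local variational idea: a concave function lies below every tangent line,
# e.g. log(x) <= log(xi) + (x - xi)/xi for all xi > 0.  Minimizing the
# bound over the variational parameter xi makes it tight at xi = x.
def log_upper_bound(x, xi):
    return math.log(xi) + (x - xi) / xi

x = 2.0
assert all(log_upper_bound(x, xi) >= math.log(x) for xi in (0.5, 1.0, 2.0, 4.0))
assert abs(log_upper_bound(x, x) - math.log(x)) < 1e-12
```

The same tangent-bound pattern, applied to the (concave) multivariate inverse beta function rather than `log`, is what makes the paper's upper bound on the DMM predictive distribution analytically tractable.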