This paper proposes a joint maximum likelihood and Bayesian methodology for estimating Gaussian mixture models. In Bayesian inference, the distributions of the parameters are modeled, characterized by hyperparameters. In the case of Gaussian mixtures, the parameter distributions are taken to be Gaussian for the means, Wishart for the covariances, and Dirichlet for the mixing probabilities. The learning task consists of estimating the hyperparameters that characterize these distributions. The integration over the parameter space is decoupled using an unsupervised variational methodology called variational expectation-maximization (VEM). This paper introduces a hyperparameter initialization procedure for the training algorithm. In the first stage, empirical distributions of the parameters are formed from successive runs of the expectation-maximization algorithm. Afterward, maximum-likelihood estimators are applied to these distributions to find appropriate initial values for the hyperparameters. The proposed initialization provides faster convergence, more accurate hyperparameter estimates, and better generalization for the VEM training algorithm. The proposed methodology is applied to blind signal detection and to color image segmentation.
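The two-stage initialization described above (successive EM runs, then maximum-likelihood fits to the collected parameter estimates) can be sketched roughly as follows. This is a minimal illustration assuming scikit-learn's GaussianMixture as the EM implementation; the function name init_hyperparameters, the component-matching heuristic, and the moment-matching fits for the Wishart and Dirichlet hyperparameters are illustrative simplifications, not the paper's exact estimators.

```python
# Sketch: run EM several times, collect the fitted GMM parameters,
# then fit simple estimators to those collections to initialize the
# hyperparameters of the conjugate priors (Gaussian / Wishart / Dirichlet).
# Names and estimator choices are illustrative, not from the paper.

import numpy as np
from sklearn.mixture import GaussianMixture

def init_hyperparameters(X, n_components, n_runs=10, seed=0):
    rng = np.random.default_rng(seed)
    means, precisions, weights = [], [], []

    for _ in range(n_runs):
        gmm = GaussianMixture(
            n_components=n_components,
            covariance_type="full",
            random_state=int(rng.integers(1 << 31)),
        ).fit(X)
        # Crude component matching: sort by the first mean coordinate so
        # that index k refers to (roughly) the same cluster across runs.
        order = np.argsort(gmm.means_[:, 0])
        means.append(gmm.means_[order])
        precisions.append(np.linalg.inv(gmm.covariances_[order]))
        weights.append(gmm.weights_[order])

    means = np.stack(means)            # (n_runs, K, d)
    precisions = np.stack(precisions)  # (n_runs, K, d, d)
    weights = np.stack(weights)        # (n_runs, K)

    # Gaussian prior on each mean: ML estimates are the empirical mean
    # and covariance of the means collected across EM runs.
    m0 = means.mean(axis=0)
    S0 = np.stack([np.cov(means[:, k].T) for k in range(n_components)])

    # Wishart prior on each precision: match the empirical mean of the
    # collected precisions, E[Lambda] = nu * W (moment matching, not
    # full ML; nu is set to a weakly informative value here).
    nu0 = X.shape[1] + 2.0
    W0 = precisions.mean(axis=0) / nu0

    # Dirichlet prior on the mixing weights: moment matching with
    # alpha_k = mean_k * alpha_tot, where alpha_tot is recovered from
    # the across-run variance of the first weight (an approximation).
    wbar = weights.mean(axis=0)
    v = weights[:, 0].var() + 1e-12
    alpha_tot = wbar[0] * (1.0 - wbar[0]) / v - 1.0
    alpha0 = np.maximum(wbar * alpha_tot, 1e-3)

    return dict(m=m0, S=S0, nu=nu0, W=W0, alpha=alpha0)
```

The hyperparameters returned by a routine like this would then seed the VEM updates in place of generic priors, which is the role the abstract attributes to the proposed initialization when it reports faster convergence and more accurate hyperparameter estimates.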