Gaussian mixture models (GMMs), commonly used in pattern recognition and machine learning, provide a flexible probabilistic model for data. The conventional expectation-maximization (EM) algorithm for maximum likelihood estimation of GMM parameters is very sensitive to initialization and easily becomes trapped in local maxima. Stochastic search algorithms have been popular alternatives for global optimization, but their use for GMM estimation has been limited to constrained models with identity or diagonal covariance matrices. Our major contributions in this paper are twofold. First, we present a novel parametrization for arbitrary covariance matrices that allows independent updating of individual parameters while retaining the validity of the resulting matrices. Second, we propose an effective parameter matching technique to mitigate the issues arising from the existence of multiple candidate solutions that are equivalent under permutations of the GMM components. Experiments on synthetic and real data sets show that the proposed framework performs robustly and achieves significantly higher likelihood values than the EM algorithm.
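To illustrate the idea of a parametrization whose individual parameters can be perturbed independently while the resulting covariance matrix stays valid, the sketch below uses a Cholesky factor with an exponentiated diagonal for the 2x2 case. This is a common stand-in for such parametrizations; the specific parametrization proposed in the paper may differ, and the function names here are hypothetical.

```python
import math

def cov_from_params(a, b, c):
    """Build a valid 2x2 covariance matrix from three unconstrained
    parameters via a lower-triangular Cholesky factor L with
    exponentiated diagonal entries: Sigma = L L^T.
    Illustrative only; not the paper's exact parametrization."""
    l11 = math.exp(a)   # exp() keeps the diagonal strictly positive
    l21 = b             # off-diagonal entry is unconstrained
    l22 = math.exp(c)
    # Sigma = L L^T, written out for the 2x2 case
    return [[l11 * l11,            l11 * l21],
            [l21 * l11, l21 * l21 + l22 * l22]]

def is_positive_definite(m):
    # Sylvester's criterion for a symmetric 2x2 matrix
    return m[0][0] > 0 and m[0][0] * m[1][1] - m[0][1] * m[1][0] > 0

# Any unconstrained triple (a, b, c) maps to a valid covariance, so a
# stochastic search can perturb each parameter independently without
# ever producing a non-positive-definite candidate.
sigma = cov_from_params(-0.5, 2.0, 1.3)
assert is_positive_definite(sigma)
```

Because validity holds for every point of the unconstrained parameter space, mutation or particle-update steps in a stochastic search need no projection or repair step for the covariance parameters.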