Unsupervised Learning of Finite Mixture Models
IEEE Transactions on Pattern Analysis and Machine Intelligence
Supporting Ranked Boolean Similarity Queries in MARS
IEEE Transactions on Knowledge and Data Engineering
A Maximum Variance Cluster Algorithm
IEEE Transactions on Pattern Analysis and Machine Intelligence
SMEM algorithm is not fully compatible with maximum-likelihood framework
Neural Computation
Computational Statistics & Data Analysis
Efficient greedy learning of Gaussian mixture models
Neural Computation
A local search approximation algorithm for k-means clustering
Computational Geometry: Theory and Applications — Special issue on the 18th Annual Symposium on Computational Geometry (SoCG 2002)
Pattern Recognition Letters
Genetic-Based EM Algorithm for Learning Gaussian Mixture Models
IEEE Transactions on Pattern Analysis and Machine Intelligence
SMEM Algorithm for Mixture Models
Neural Computation
Estimating local optimums in EM algorithm over Gaussian mixture model
Proceedings of the 25th International Conference on Machine Learning
Random swap EM algorithm for finite mixture models in image segmentation
ICIP '09: Proceedings of the 16th IEEE International Conference on Image Processing
RSEM: An Accelerated Algorithm on Repeated EM
ICIG '11: Proceedings of the 2011 Sixth International Conference on Image and Graphics
The expectation-maximization (EM) algorithm is a popular way to estimate the parameters of Gaussian mixture models. Unfortunately, its performance depends heavily on the initialization. We propose a random swap EM (RSEM) for the initialization of EM. Instead of starting from a completely new solution in each repeat, as in repeated EM, we apply a random perturbation to the current solution before continuing the EM iterations. The removal and addition steps of a random swap are simpler and more natural than split-and-merge or crossover-and-mutation operations. The most important benefits of random swap are its simplicity and efficiency: RSEM needs only the number of swaps as a parameter, in contrast to the complicated parameter setting of genetic-based EM (GAEM). Experiments show that the proposed algorithm is 9-63% faster in computation time than repeated EM, and 20-83% faster than split-and-merge EM in all but one case. For synthetic data with a certain parameter setting, RSEM is much faster than GAEM but attains a lower log-likelihood; in the other cases, the proposed algorithm reaches comparable results in terms of log-likelihood.
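The swap step described in the abstract — remove one component, add a new one, then continue EM and keep the best solution found — can be sketched as follows. This is a simplified illustration, not the paper's implementation: it assumes equal mixing weights and unit spherical covariances (so only the means are estimated), and it re-adds the removed component at a randomly chosen data point. All function and variable names here are my own.

```python
import numpy as np

def log_likelihood(X, means):
    # Log-likelihood under a mixture with equal weights and unit spherical
    # covariance (a simplification; the paper uses full Gaussian mixtures).
    d2 = ((X[:, None, :] - means[None, :, :]) ** 2).sum(-1)  # squared distances
    k, d = means.shape[0], X.shape[1]
    log_p = -0.5 * d2 - 0.5 * d * np.log(2 * np.pi) - np.log(k)
    m = log_p.max(axis=1, keepdims=True)                     # log-sum-exp trick
    return float((m[:, 0] + np.log(np.exp(log_p - m).sum(axis=1))).sum())

def em(X, means, iters=20):
    # EM for the simplified model: E-step responsibilities, M-step mean update.
    for _ in range(iters):
        d2 = ((X[:, None, :] - means[None, :, :]) ** 2).sum(-1)
        log_r = -0.5 * d2
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        means = (r.T @ X) / r.sum(axis=0)[:, None]
    return means

def random_swap_em(X, k, swaps=10, rng=None):
    # RSEM-style loop: perturb the best solution by replacing one component
    # (removal + addition = swap), continue EM, and keep the best result.
    rng = rng or np.random.default_rng(0)
    means = X[rng.choice(len(X), size=k, replace=False)].copy()
    means = em(X, means)
    best, best_ll = means, log_likelihood(X, means)
    for _ in range(swaps):
        cand = best.copy()
        cand[rng.integers(k)] = X[rng.integers(len(X))]  # the random swap
        cand = em(X, cand)
        ll = log_likelihood(X, cand)
        if ll > best_ll:
            best, best_ll = cand, ll
    return best, best_ll
```

Note that, in contrast to repeated EM, each swap reuses the current best solution rather than restarting from scratch, and the only tuning parameter is the number of swaps.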