Random swap EM algorithm for Gaussian mixture models
Pattern Recognition Letters
The Expectation-Maximization (EM) algorithm is a popular tool for estimating model parameters, especially in mixture models. Because EM is a hill-climbing approach, it can be trapped by local maxima, plateaus, and ridges. For mixture models, these problems are tied to the initialization of the algorithm and to the structure of the data set. We propose a random swap EM algorithm (RSEM) to overcome these problems in Gaussian mixture models. Our method repeatedly performs random swaps, which can break the configuration that traps the algorithm at a local maximum. Compared with competing strategies, the proposed algorithm attains higher log-likelihood values in most cases and exhibits lower variance than the other algorithms. We also apply RSEM to the image segmentation problem.
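The idea sketched in the abstract can be illustrated in code. The following is a minimal sketch, not the authors' implementation: it runs plain EM for a Gaussian mixture, then repeatedly "swaps" one component by relocating it to a randomly chosen data point and re-running EM, keeping the swap only when the log-likelihood improves. All function names, the swap-acceptance rule, and the regularization constants are illustrative assumptions.

```python
import numpy as np

def em_gmm(X, means, covs, weights, n_iter=50, eps=1e-6):
    # Standard EM for a Gaussian mixture; returns updated parameters
    # and the final log-likelihood. eps regularizes near-singular covariances.
    n, d = X.shape
    k = len(weights)
    for _ in range(n_iter):
        # E-step: component densities and responsibilities
        dens = np.empty((n, k))
        for j in range(k):
            cov = covs[j] + eps * np.eye(d)
            diff = X - means[j]
            expo = -0.5 * np.sum(diff @ np.linalg.inv(cov) * diff, axis=1)
            norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(cov))
            dens[:, j] = weights[j] * np.exp(expo) / norm
        total = dens.sum(axis=1, keepdims=True) + 1e-300
        resp = dens / total
        # M-step: re-estimate weights, means, covariances
        nk = resp.sum(axis=0) + 1e-12
        weights = nk / n
        means = (resp.T @ X) / nk[:, None]
        for j in range(k):
            diff = X - means[j]
            covs[j] = (resp[:, j, None] * diff).T @ diff / nk[j]
    ll = np.sum(np.log(total))
    return means, covs, weights, ll

def random_swap_em(X, k, n_swaps=10, rng=None):
    # Hedged sketch of the random-swap idea: after EM converges, relocate one
    # randomly chosen component to a random data point, re-run EM, and accept
    # the swap only if the log-likelihood improves.
    rng = np.random.default_rng(rng)
    n, d = X.shape
    means = X[rng.choice(n, k, replace=False)].astype(float).copy()
    covs = np.array([np.cov(X.T) + 1e-6 * np.eye(d) for _ in range(k)])
    weights = np.full(k, 1.0 / k)
    means, covs, weights, best_ll = em_gmm(X, means, covs, weights)
    for _ in range(n_swaps):
        m2, c2, w2 = means.copy(), covs.copy(), weights.copy()
        j = rng.integers(k)
        m2[j] = X[rng.integers(n)]                  # random swap of component j
        c2[j] = np.cov(X.T) + 1e-6 * np.eye(d)      # reset its covariance
        m2, c2, w2, ll = em_gmm(X, m2, c2, w2)
        if ll > best_ll:                            # keep only improving swaps
            means, covs, weights, best_ll = m2, c2, w2, ll
    return means, covs, weights, best_ll
```

On data with well-separated clusters, a poor initialization can leave plain EM with two components sharing one cluster; a successful swap relocates the redundant component and lets EM refit, which is how the repeated swaps escape such local maxima.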