Obtaining an optimal set of parameters for a mixture model is a computationally challenging problem with many applications in science and engineering. A novel algorithm is proposed that uses convolution-based smoothing to construct a hierarchy (or family) of smoothed log-likelihood surfaces. The approach first smooths the likelihood function and applies the EM algorithm to obtain a promising solution on the smoothed surface; the most promising solutions are then used as initial guesses for EM on the original likelihood. Although the results are demonstrated using only two levels, the method can in principle be applied to any number of levels in the hierarchy. A theoretical analysis shows that the smoothing indeed reduces the overall gradient of a modified version of the likelihood surface, so the optimization procedure effectively avoids extensive searching in non-promising regions of the parameter space. Results on several benchmark datasets demonstrate significant improvements of the proposed algorithm over competing approaches, and empirical evidence is provided for the reduction in the number of local maxima and for the resulting improvements in initialization.