On convergence of the EM algorithm and the Gibbs sampler
Statistics and Computing
We compare EM, SEM, and MCMC algorithms for estimating the parameters of the Gaussian mixture model. We focus on estimation problems that arise when the likelihood function has a sharp ridge or saddle points, using both synthetic and empirical data with these features. The comparison includes Bayesian approaches with different prior specifications and various procedures for dealing with label switching. Although the stochastic algorithms more often produce degenerate solutions, we conclude that SEM and MCMC can converge faster and are better able to locate the global maximum of the likelihood function.
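To make the baseline concrete, here is a minimal sketch of the EM algorithm for a two-component univariate Gaussian mixture. The function `em_gmm`, its random initialization, and the synthetic data are illustrative assumptions, not the paper's implementation; note that the M-step can drive a component variance toward zero on degenerate ridges of the likelihood, which is exactly the failure mode the comparison above examines.

```python
import numpy as np

def em_gmm(x, n_iter=200, seed=0):
    """EM for a two-component univariate Gaussian mixture (illustrative sketch).

    Returns (weights, means, variances) after n_iter EM iterations.
    """
    rng = np.random.default_rng(seed)
    w = np.array([0.5, 0.5])                   # initial mixing weights
    mu = rng.choice(x, size=2, replace=False)  # means drawn from the data
    var = np.array([x.var(), x.var()])         # start both variances at the sample variance
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point
        dens = (w / np.sqrt(2 * np.pi * var)
                * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var))
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: weighted maximum-likelihood updates
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        # If one component collapses onto a single point, var -> 0 here
        # and the likelihood diverges (a degenerate solution).
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

# Synthetic data from a well-separated two-component mixture
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2.0, 1.0, 500), rng.normal(3.0, 1.0, 500)])
w, mu, var = em_gmm(x)
```

A stochastic EM (SEM) variant would replace the soft responsibilities `r` in the E-step with a random hard assignment drawn from them, which is one way such algorithms escape ridges and saddle points that trap plain EM.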