Assessing a Mixture Model for Clustering with the Integrated Completed Likelihood
IEEE Transactions on Pattern Analysis and Machine Intelligence
Bayesian Regularization for Normal Mixture Estimation and Model-Based Clustering
Journal of Classification
Inference for multivariate normal mixtures
Journal of Multivariate Analysis
A computational strategy for doubly smoothed MLE exemplified in the normal mixture model
Computational Statistics & Data Analysis
Root selection in normal mixture models
Computational Statistics & Data Analysis
Gaussian mixtures are very flexible in representing the underlying structure of data. However, likelihood inference for Gaussian mixtures with unrestricted covariance matrices is theoretically and practically challenging because the likelihood function is unbounded and often has multiple local maximizers. As the numerical studies in this paper show, the presence of multiple local maximizers, including spurious ones, degrades the performance of the model selection criteria used to choose the number of components. We propose a new likelihood-based estimator for Gaussian mixture models, the gradient-based k-deleted maximum likelihood estimator, designed to avoid spurious local maximizers and to select a statistically desirable local maximizer when several exist. We first prove the consistency of the proposed estimator and then examine, through a real-data example and simulation studies, how it performs under the likelihood-based model selection criteria commonly used to assess the number of components in Gaussian mixture models.
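To illustrate the core idea, the following is a minimal sketch of a plain k-deleted log-likelihood for a univariate Gaussian mixture: the k largest per-observation log-likelihood contributions are removed before summing, so a spurious maximizer (a component with near-zero variance concentrated on a single data point) loses its inflated contribution and is penalized relative to a genuine maximizer. This simplified version omits the gradient-based weighting of the authors' estimator; the function names and the choice of k are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def gmm_loglik_per_point(x, weights, means, sds):
    # Per-observation log-likelihood of a univariate Gaussian mixture.
    # x: (n,) data; weights/means/sds: (m,) component parameters.
    x = np.asarray(x, dtype=float)[:, None]           # shape (n, 1)
    comp = (np.log(weights)
            - 0.5 * np.log(2.0 * np.pi * sds**2)
            - (x - means) ** 2 / (2.0 * sds**2))      # shape (n, m)
    # log-sum-exp across components for numerical stability
    mx = comp.max(axis=1, keepdims=True)
    return mx[:, 0] + np.log(np.exp(comp - mx).sum(axis=1))

def k_deleted_loglik(x, weights, means, sds, k=1):
    # Drop the k largest per-point contributions, then sum the rest.
    # A degenerate component fitting one point exactly contributes a
    # huge term at that point; deleting it removes the artificial gain.
    ll = np.sort(gmm_loglik_per_point(x, weights, means, sds))  # ascending
    return ll[:-k].sum() if k > 0 else ll.sum()
```

For example, on two well-separated clusters, a solution with a third near-degenerate component spiked on one observation beats a sensible two-component fit under the ordinary log-likelihood, but the 1-deleted criterion reverses that ordering.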