The efficacy of family-based approaches to mixture model-based clustering and classification depends on the selection of parsimonious models. Current wisdom suggests the Bayesian information criterion (BIC) for mixture model selection. However, the BIC has well-known limitations, including a tendency to overestimate the number of components as well as a proclivity for underestimating, often drastically, the number of components in higher dimensions. While the former problem might be soluble by merging components, the latter is impossible to mitigate in clustering and classification applications. In this paper, a LASSO-penalized BIC (LPBIC) is introduced to overcome this problem. This approach is illustrated based on applications of extensions of mixtures of factor analyzers, where the LPBIC is used to select both the number of components and the number of latent factors. The LPBIC is shown to match or outperform the BIC in several situations.
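To make the baseline concrete, the sketch below illustrates ordinary BIC-based selection of the number of mixture components, the approach the abstract identifies as current practice; it is not the LPBIC proposed in the paper, and the synthetic data and candidate range of components are assumptions for illustration only.

```python
# Minimal sketch (not the paper's LPBIC): standard BIC-based selection of the
# number of Gaussian mixture components using scikit-learn. The data set and
# the range of candidate component counts are illustrative assumptions.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Two well-separated Gaussian clusters in five dimensions (synthetic data).
X = np.vstack([
    rng.normal(loc=0.0, scale=1.0, size=(200, 5)),
    rng.normal(loc=4.0, scale=1.0, size=(200, 5)),
])

# Fit mixtures with G = 1, ..., 6 components and record the BIC for each fit.
bics = {}
for g in range(1, 7):
    gmm = GaussianMixture(n_components=g, covariance_type="full",
                          random_state=0).fit(X)
    bics[g] = gmm.bic(X)  # scikit-learn's BIC: lower values are better

best_g = min(bics, key=bics.get)
print(f"BIC-selected number of components: {best_g}")
```

In higher-dimensional settings with a full covariance structure per component, the parameter count grows quickly, which is the regime in which the abstract notes the BIC can drastically underestimate the number of components and in which the LPBIC is intended to help.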