This paper addresses the problem of estimating the correct number of components in a Gaussian mixture from a sample data set. In particular, we propose an extension of the Gaussian-means (G-means) and Projected Gaussian-means (PG-means) algorithms, both of which rely on one-dimensional statistical hypothesis tests. G-means and PG-means are wrappers around the k-means and Expectation-Maximization (EM) algorithms, respectively. Although G-means is simple and fast, it performs poorly when clusters overlap because it relies on k-means. PG-means can handle overlapping clusters but requires more computation and sometimes fails to find the right number of clusters. We propose an extension, called Extended Projected Gaussian-means (XPG-means), which wraps the Possibilistic Fuzzy c-Means (PFCM) algorithm. XPG-means integrates the advantages of both earlier algorithms while resolving some of their disadvantages involving overlapping clusters, noise, and computational complexity. Specifically, XPG-means handles overlapping clusters better than G-means because it uses fuzzy clustering, and it handles noise better than both algorithms because it uses possibilistic clustering. XPG-means is also less computationally expensive than PG-means: it applies the local, Gaussian-specific hypothesis-testing scheme of G-means, whereas PG-means applies a more general Kolmogorov-Smirnov test to the full mixture. In addition, XPG-means shows less variance in estimating the number of components than either of the other algorithms.
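To make the "local hypothesis-testing scheme" concrete, the sketch below illustrates the G-means-style per-cluster test that the abstract says XPG-means reuses: project one cluster's points onto the principal axis of their covariance and run an Anderson-Darling normality test on the resulting one-dimensional sample. This is a minimal sketch of that general idea, not the paper's implementation; the function name, the SVD-based projection, and the 5% significance level are illustrative assumptions.

```python
# Hedged sketch of a G-means-style local Gaussianity test for one cluster.
# A cluster that fails the test would be split and re-fit by the wrapper loop.
import numpy as np
from scipy.stats import anderson


def cluster_looks_gaussian(points, significance=5.0):
    """Return True if the cluster's 1-D principal projection passes an
    Anderson-Darling test for normality at the given level (percent).

    `points` is an (n, d) array of the samples assigned to one cluster.
    Illustrative helper; not from the paper.
    """
    points = np.asarray(points, dtype=float)
    centered = points - points.mean(axis=0)

    # Principal direction = leading right singular vector of the
    # centered data (equivalently, the top eigenvector of the covariance).
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    projection = centered @ vt[0]

    # Anderson-Darling test against a normal distribution; scipy reports
    # critical values at fixed significance levels (15, 10, 5, 2.5, 1 percent).
    result = anderson(projection, dist='norm')
    idx = list(result.significance_level).index(significance)
    return result.statistic < result.critical_values[idx]
```

In a G-means-style wrapper, each cluster failing this test is split into two children and the base clusterer is rerun; the difference in XPG-means, per the abstract, is that the base clusterer between tests is PFCM rather than k-means.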