Computer simulation methods in theoretical physics
Stochastic global optimization methods. Part I: clustering methods
Mathematical Programming: Series A and B
Stochastic global optimization methods. Part II: multi level methods
Mathematical Programming: Series A and B
Introduction to statistical pattern recognition (2nd ed.)
Computational Statistics & Data Analysis - Special issue on classification
Efficient Approximations for the Marginal Likelihood of Bayesian Networks with Hidden Variables
Machine Learning - Special issue on learning with probabilistic representations
Bioinformatics: the machine learning approach
Unsupervised Learning of Finite Mixture Models
IEEE Transactions on Pattern Analysis and Machine Intelligence
Pattern Recognition and Neural Networks
Model selection for probabilistic clustering using cross-validated likelihood
Statistics and Computing
A Greedy EM Algorithm for Gaussian Mixture Learning
Neural Processing Letters
Efficient greedy learning of Gaussian mixture models
Neural Computation
Pattern Classification (2nd Edition)
A spatially constrained mixture model for image segmentation
IEEE Transactions on Neural Networks
ICANN '09 Proceedings of the 19th International Conference on Artificial Neural Networks: Part II
High-performance dynamic quantum clustering on graphics processors
Journal of Computational Physics
Given a data set, a dynamical procedure is applied to the data points in order to shrink and separate possibly overlapping clusters. Specifically, Newton's equations of motion are employed to concentrate the data points around their cluster centers, using an attractive potential constructed specifically for this purpose. During this process, important information is gathered concerning the spread of each cluster. This information is then used to construct an objective function that maps each cluster to a local maximum. Global optimization is applied to retrieve the positions of these maxima, which correspond to the locations of the cluster centers. Further refinement is achieved by applying the EM algorithm to a Gaussian mixture model whose construction and initialization are based on the acquired information. To assess the effectiveness of our method, we conducted experiments on a wide range of benchmark data sets and compared its performance against four clustering techniques that are well established in the literature.
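The core idea of the shrinking stage can be illustrated with a minimal sketch. The code below is not the paper's exact procedure: instead of integrating Newton's equations under the authors' purpose-built attractive potential, it uses a mean-shift-style overdamped update in which each point drifts toward the Gaussian-weighted mean of its neighbors, so that possibly overlapping clusters collapse onto their respective density maxima. The bandwidth, step size, and merging tolerance are illustrative choices, not values from the paper.

```python
import numpy as np

def shrink_clusters(X, bandwidth=1.0, steps=50, lr=0.5):
    """Drift each point toward the Gaussian-weighted mean of its neighbors.

    A mean-shift-style stand-in for the paper's attractive dynamics:
    points concentrate around local density maxima, i.e. cluster centers.
    """
    Y = X.copy()
    for _ in range(steps):
        # pairwise squared distances between current point positions
        d2 = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        W = np.exp(-d2 / (2 * bandwidth**2))
        # weighted neighborhood mean points toward higher density
        target = W @ Y / W.sum(axis=1, keepdims=True)
        Y += lr * (target - Y)
    return Y

def extract_centers(Y, tol=0.1):
    """Merge converged points that collapsed onto the same maximum."""
    centers = []
    for y in Y:
        if all(np.linalg.norm(y - c) > tol for c in centers):
            centers.append(y)
    return np.array(centers)

# Two well-separated synthetic blobs: the dynamics should yield 2 centers.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.2, (30, 2)),
               rng.normal(5.0, 0.2, (30, 2))])
centers = extract_centers(shrink_clusters(X))
print(len(centers))
```

In the full method described above, the recovered center locations and the per-cluster spread gathered during the dynamics would then initialize a Gaussian mixture model, which the EM algorithm refines into the final clustering.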