We discuss one of the shortcomings of the standard K-means algorithm: its tendency to converge to a local rather than a global optimum. This is often mitigated by running the algorithm from several different random initialisations; in this paper, however, we attack the problem by amending the algorithm's performance function so as to incorporate global information. We do this in three different ways and show on artificial data sets that the resulting algorithms are less initialisation-dependent than the standard K-means algorithm. We also show how to create a family of topology-preserving manifolds using these algorithms together with an underlying constraint on the positioning of the prototypes.
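The random-restart baseline mentioned above can be sketched as follows. This is a minimal illustrative implementation of standard K-means (Lloyd's algorithm) with multiple restarts, not the amended performance functions proposed in the paper; all function names and the toy data are our own:

```python
import random

def kmeans(points, k, iters=100, seed=0):
    """Standard K-means; the result depends on the random initialisation."""
    rng = random.Random(seed)
    centres = rng.sample(points, k)
    for _ in range(iters):
        # Assignment step: each point joins its nearest centre.
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centres[c])))
            clusters[j].append(p)
        # Update step: move each centre to the mean of its cluster
        # (an empty cluster keeps its previous centre).
        new = [tuple(sum(x) / len(c) for x in zip(*c)) if c else centres[j]
               for j, c in enumerate(clusters)]
        if new == centres:  # converged (to a possibly local optimum)
            break
        centres = new
    # Performance function: sum of squared distances to the nearest centre.
    sse = sum(min(sum((a - b) ** 2 for a, b in zip(p, c)) for c in centres)
              for p in points)
    return centres, sse

def kmeans_restarts(points, k, restarts=10):
    """Mitigate local optima by keeping the best of several random restarts."""
    return min((kmeans(points, k, seed=s) for s in range(restarts)),
               key=lambda result: result[1])
```

Because each restart may converge to a different local optimum of the sum-of-squared-errors performance function, only the lowest-error run is kept; the paper's approach instead modifies the performance function itself so that a single run is less sensitive to where the prototypes start.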