Clustering approaches constitute important methods for unsupervised data analysis. Traditionally, many clustering models focus on spherical or ellipsoidal clusters in Euclidean space. Kernel methods extend these approaches to more complex cluster shapes and have recently been integrated into several clustering techniques. While leading to very flexible representations, kernel clustering has the drawback of high memory and time complexity, since it depends on the full Gram matrix and represents clusters only implicitly in terms of feature vectors. In this contribution, we accelerate the kernelized Neural Gas algorithm by incorporating a Nyström approximation scheme and active learning, and we arrive at sparse solutions by integrating a sparsity constraint. We provide experimental results showing that these accelerations improve time and memory complexity without deteriorating accuracy.
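To illustrate the Nyström idea mentioned above, the following is a minimal sketch (not taken from the paper) of how a full n × n Gram matrix can be replaced by a low-rank factorization built from m randomly sampled landmark points, so that K is approximated as C W⁺ Cᵀ; the RBF kernel, the parameter names, and the landmark-sampling scheme are illustrative assumptions:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.05):
    # Gram matrix of the RBF kernel between rows of X and rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystroem_approx(X, m=80, gamma=0.05, seed=0):
    """Return factors C (n x m) and W_pinv (m x m) with K ~= C @ W_pinv @ C.T.

    Only m columns of the Gram matrix are ever computed, reducing memory
    from O(n^2) to O(n * m).
    """
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)   # landmark indices
    landmarks = X[idx]
    C = rbf_kernel(X, landmarks, gamma)               # n x m cross-kernel
    W = rbf_kernel(landmarks, landmarks, gamma)       # m x m landmark kernel
    return C, np.linalg.pinv(W)

# Usage: compare the low-rank reconstruction against the full Gram matrix.
X = np.random.default_rng(1).normal(size=(200, 5))
C, W_pinv = nystroem_approx(X, m=80)
K_approx = C @ W_pinv @ C.T
K_full = rbf_kernel(X, X)
rel_err = np.linalg.norm(K_full - K_approx) / np.linalg.norm(K_full)
```

In a kernelized clustering algorithm such as Neural Gas, downstream computations would operate on the factors C and W⁺ directly rather than materializing K_approx, which is what yields the complexity savings.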