Prototype-based classification schemes offer very intuitive and flexible classifiers with the benefit of easy interpretability of the results and scalability of the model complexity. Recent prototype-based models such as robust soft learning vector quantization (RSLVQ) have the additional benefit of a solid mathematical foundation: the learning rule and decision boundaries derive from probabilistic models and the corresponding likelihood optimization. In its original form, however, RSLVQ can be used for standard Euclidean vectors only. In this contribution, we extend RSLVQ towards a kernelized version which can be used with any positive semidefinite data matrix. We demonstrate the performance of the technique, kernel RSLVQ, on a variety of benchmarks, where it obtains results competitive with or even superior to state-of-the-art support vector machines.
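The idea behind the kernelization can be sketched as follows: each prototype is represented implicitly in the kernel feature space as a linear combination of the training points, so only a coefficient matrix is adapted, and all distances are computed through the kernel matrix. The code below is a minimal illustrative sketch under these assumptions, not the authors' implementation; the function names, the single-prototype-per-class initialization, and all hyperparameter values (`sigma2`, `lr`, `epochs`) are illustrative choices.

```python
import numpy as np

def kernel_rslvq_fit(K, y, proto_labels, sigma2=1.0, lr=0.05, epochs=30, seed=0):
    """Illustrative sketch of kernel RSLVQ training.

    K            : (n, n) positive semidefinite kernel matrix of the training data.
    y            : (n,) class labels of the training points.
    proto_labels : (m,) class label of each prototype.

    Prototype j lives in feature space as w_j = sum_i Gamma[j, i] * phi(x_i),
    so the gradient updates act on the coefficient matrix Gamma only.
    """
    rng = np.random.default_rng(seed)
    proto_labels = np.asarray(proto_labels)
    n, m = K.shape[0], len(proto_labels)

    # Initialize each prototype on a random training point of its class.
    Gamma = np.zeros((m, n))
    for j, c in enumerate(proto_labels):
        Gamma[j, rng.choice(np.flatnonzero(y == c))] = 1.0

    diag = np.diag(K)
    for _ in range(epochs):
        for i in rng.permutation(n):
            # Squared feature-space distances of phi(x_i) to all prototypes:
            # ||phi(x_i)||^2 - 2 <phi(x_i), w_j> + ||w_j||^2, via kernel only.
            d = diag[i] - 2.0 * (K[i] @ Gamma.T) \
                + np.einsum('jn,nk,jk->j', Gamma, K, Gamma)
            g = np.exp(-(d - d.min()) / (2.0 * sigma2))   # shifted for stability
            P = g / g.sum()                               # p(j | x_i)
            correct = (proto_labels == y[i])
            Py = np.where(correct, g, 0.0)
            Py = Py / Py.sum()                            # p(j | x_i, y_i)
            # RSLVQ likelihood-ratio gradient; phi(x_i) corresponds to the
            # i-th unit coefficient vector e_i in this representation.
            coef = np.where(correct, Py - P, -P)
            e_i = np.zeros(n)
            e_i[i] = 1.0
            Gamma += lr * coef[:, None] * (e_i[None, :] - Gamma)
    return Gamma

def kernel_rslvq_predict(K_cross, k_diag, K_train, Gamma, proto_labels):
    """Nearest-prototype classification in feature space.

    K_cross : (t, n) kernel between test and training points.
    k_diag  : (t,) self-similarities k(x, x) of the test points.
    """
    d = k_diag[:, None] - 2.0 * (K_cross @ Gamma.T) \
        + np.einsum('jn,nk,jk->j', Gamma, K_train, Gamma)[None, :]
    return np.asarray(proto_labels)[np.argmin(d, axis=1)]
```

Because every quantity is expressed through inner products, the same code works for any positive semidefinite similarity matrix, e.g. a linear kernel `K = X @ X.T` or an RBF kernel, without ever touching explicit feature vectors.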