In this article we extend the recently published unsupervised information-theoretic vector quantization approach, which matches data and prototype densities by means of the Cauchy–Schwarz divergence, to supervised learning and classification. First, we generalize the unsupervised method from the Euclidean metric used in the original algorithm to more general metrics. We then extend the model to a supervised learning scheme, yielding a fuzzy classification algorithm that admits fuzzy labels for both data and prototypes. Finally, we transfer the idea of relevance learning for metric adaptation, known from learning vector quantization, to the new approach.
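The underlying unsupervised objective can be illustrated with a short sketch. The snippet below is not the authors' implementation; it is a minimal NumPy example, assuming Gaussian Parzen-window density estimates with a shared bandwidth `sigma` and a hypothetical helper `cs_divergence`, of the Cauchy–Schwarz divergence between the data density and the prototype density that the vector quantization scheme minimizes.

```python
import numpy as np

def _gauss(d2, sigma2, dim):
    # Gaussian kernel value at squared distance d2 with variance sigma2.
    return np.exp(-d2 / (2.0 * sigma2)) / (2.0 * np.pi * sigma2) ** (dim / 2.0)

def cs_divergence(X, W, sigma=1.0):
    """Cauchy-Schwarz divergence D_CS(p, q) = -log( <p,q>^2 / (<p,p> <q,q>) )
    between Parzen estimates p (data X, shape n x d) and q (prototypes W, m x d).

    For Gaussian kernels the inner products <p,q> have a closed form:
    convolving two Gaussians of variance sigma^2 doubles the variance.
    """
    d = X.shape[1]
    s2 = 2.0 * sigma ** 2  # variance of the convolved kernel

    def overlap(A, B):
        # Mean pairwise kernel value = estimate of the density inner product.
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
        return _gauss(d2, s2, d).mean()

    cross = overlap(X, W)
    return -np.log(cross ** 2 / (overlap(X, X) * overlap(W, W)))
```

By the Cauchy–Schwarz inequality the divergence is non-negative and vanishes exactly when the two density estimates coincide, e.g. when the prototypes sit on the data points; gradient descent on this quantity with respect to `W` recovers a density-matching vector quantizer. Replacing the squared Euclidean distance inside `overlap` with a parametrized metric is the entry point for the metric generalization and relevance learning described above.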