We discuss the use of divergences in dissimilarity-based classification. Divergences can be employed whenever vectorial data consist of non-negative, potentially normalized features, as is the case, for instance, for spectral data or histograms. In particular, we introduce and study divergence-based learning vector quantization (DLVQ). We derive cost-function-based DLVQ schemes for the family of γ-divergences, which includes the well-known Kullback-Leibler divergence and the so-called Cauchy-Schwarz divergence as special cases. The corresponding training schemes are applied to two different real-world data sets. The first, the Wisconsin Breast Cancer benchmark data set, is available in the public domain. In the second problem, color histograms of leaf images are used to detect the presence of cassava mosaic disease in cassava plants. We compare the use of standard Euclidean distances with DLVQ for different parameter settings. We show that DLVQ can yield superior classification accuracies and Receiver Operating Characteristics.
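To make the idea concrete, here is a minimal sketch of divergence-based nearest-prototype classification using the Cauchy-Schwarz divergence, one member of the γ-divergence family named above. The function and variable names, the toy histograms, and the class labels are illustrative assumptions, not taken from the paper; a full DLVQ scheme would additionally adapt the prototypes by gradient descent on a cost function.

```python
import math

def cauchy_schwarz_divergence(p, q):
    """Cauchy-Schwarz divergence between non-negative vectors:
    D_CS(p, q) = -log( <p, q> / (||p|| * ||q||) ).
    It is zero exactly when p and q are proportional."""
    dot = sum(pi * qi for pi, qi in zip(p, q))
    norm_p = math.sqrt(sum(pi * pi for pi in p))
    norm_q = math.sqrt(sum(qi * qi for qi in q))
    return -math.log(dot / (norm_p * norm_q))

def classify(x, prototypes):
    """Assign x the label of the prototype with the smallest divergence."""
    label, _ = min(((lbl, cauchy_schwarz_divergence(x, w))
                    for lbl, w in prototypes),
                   key=lambda t: t[1])
    return label

# Hypothetical prototypes: normalized 3-bin color histograms per class.
prototypes = [("healthy",  [0.7, 0.2, 0.1]),
              ("diseased", [0.2, 0.3, 0.5])]

print(classify([0.6, 0.3, 0.1], prototypes))  # → healthy
```

In a trained DLVQ classifier the prototypes would be learned from labeled histograms rather than fixed by hand; the divergence simply replaces the squared Euclidean distance in the nearest-prototype rule.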