Learning vector quantization (LVQ) is a popular class of adaptive nearest prototype classifiers for multiclass classification, but learning algorithms from this family have so far been proposed on heuristic grounds. Here, we take a more principled approach and derive two variants of LVQ using a Gaussian mixture ansatz. We propose an objective function based on a likelihood ratio and derive a learning rule using gradient descent. The new approach provides a way to extend the algorithms of the LVQ family to different distance measures and allows for the design of "soft" LVQ algorithms. Benchmark results show that the new methods lead to better classification performance than LVQ 2.1. An additional benefit is that model assumptions are made explicit, so the method can be adapted more easily to different kinds of problems.
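The combination described above (a Gaussian mixture per class, a likelihood-ratio objective, and a gradient-based prototype update) can be sketched as follows. This is a minimal NumPy illustration of a soft LVQ-style update under those assumptions, not the authors' exact implementation; the toy data, single prototype per class, fixed mixture width `sigma2`, and learning rate are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data: two Gaussian blobs (illustrative only).
X = np.vstack([rng.normal(0.0, 0.5, (50, 2)), rng.normal(2.0, 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# One prototype per class; sigma2 is the (fixed) Gaussian component width.
W = np.array([[0.5, 0.5], [1.5, 1.5]])
c = np.array([0, 1])          # class label of each prototype
sigma2, lr = 0.5, 0.05

def soft_assignments(x):
    """Posterior P(j | x) under equal-prior Gaussian components."""
    d = np.sum((W - x) ** 2, axis=1)
    g = np.exp(-d / (2 * sigma2))
    return g / g.sum(), g

for epoch in range(30):
    for x, label in zip(X, y):
        P_all, g = soft_assignments(x)
        same = (c == label)
        # Posterior restricted to the prototypes of the correct class.
        P_y = np.where(same, g, 0.0)
        P_y = P_y / P_y.sum()
        # Gradient ascent on the likelihood ratio log p(x, y) - log p(x):
        # correct-class prototypes are attracted, others repelled softly.
        coef = np.where(same, P_y - P_all, -P_all)
        W += lr * coef[:, None] * (x - W) / sigma2

pred = np.array([c[np.argmax(soft_assignments(x)[0])] for x in X])
accuracy = (pred == y).mean()
```

Because the assignments are posterior probabilities rather than hard winner-take-all decisions, every prototype receives a graded update on every sample; swapping the squared Euclidean distance in `soft_assignments` for another differentiable dissimilarity is what allows the extension to different distance measures mentioned above.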