A variety of modifications have been applied to learning vector quantization (LVQ) algorithms, using either crisp or soft windows to select the data that drive each update. Although these schemes have been shown to improve performance in practice, theoretical study of the influence of windows has so far been limited. Here we rigorously analyze the influence of windows in a controlled environment of high-dimensional Gaussian mixtures. Concepts from statistical physics and the theory of online learning allow an exact description of the training dynamics, yielding typical learning curves, convergence properties, and achievable generalization abilities. We compare the performance of several algorithms, including LVQ 2.1, generalized LVQ (GLVQ), Learning from Mistakes (LFM), and Robust Soft LVQ (RSLVQ), and demonstrate their respective advantages. We find that the choice of window parameter strongly influences the learning curves but, surprisingly, not the asymptotic performance of LVQ 2.1 and RSLVQ. Although the prototypes of LVQ 2.1 exhibit divergent behavior, the resulting decision boundary coincides with the optimal one, thus yielding optimal generalization ability.
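As a rough illustration of the "crisp window" idea referred to above, the sketch below implements a single online LVQ 2.1 update with the commonly used window criterion based on the ratio of distances to the closest correct and closest incorrect prototype. The function name, parameter values, and NumPy-based setup are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

def lvq21_step(x, y, prototypes, labels, eta=0.01, window=0.3):
    """One online LVQ 2.1 update with a crisp window rule (illustrative sketch).

    x          : feature vector of the current example
    y          : class label of the current example
    prototypes : array of shape (n_prototypes, n_features)
    labels     : array of shape (n_prototypes,) with the class of each prototype
    eta        : learning rate
    window     : window parameter controlling how close to the decision
                 boundary an example must lie to trigger an update
    """
    d = np.sum((prototypes - x) ** 2, axis=1)       # squared Euclidean distances
    same = np.where(labels == y)[0]                 # prototypes of the correct class
    other = np.where(labels != y)[0]                # prototypes of any other class

    j = same[np.argmin(d[same])]                    # closest correct prototype
    k = other[np.argmin(d[other])]                  # closest incorrect prototype
    dj, dk = d[j], d[k]

    # Window criterion: update only if the example falls near the current
    # decision boundary, i.e. the two distances are sufficiently similar.
    s = (1.0 - window) / (1.0 + window)
    if min(dj / dk, dk / dj) > s:
        prototypes[j] += eta * (x - prototypes[j])  # attract the correct prototype
        prototypes[k] -= eta * (x - prototypes[k])  # repel the incorrect prototype
    return prototypes
```

In this sketch a smaller `window` restricts updates to examples lying close to the current decision boundary; it is precisely the effect of this selection mechanism on learning curves and asymptotic behavior that the paper analyzes.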