Vector quantization and signal compression
Stochastic algorithms for exploratory data analysis: data clustering and data visualization
Learning in graphical models
ACM Computing Surveys (CSUR)
Pattern Recognition with Fuzzy Objective Function Algorithms
Self-Organizing Maps
Statistical Mechanics of Learning
Performance analysis of LVQ algorithms: a statistical physics approach
Neural Networks - 2006 Special issue: Advances in self-organizing maps--WSOM'05
Dynamics and Generalization Ability of LVQ Algorithms
The Journal of Machine Learning Research
`Neural-gas' network for vector quantization and its application to time-series prediction
IEEE Transactions on Neural Networks
Phase transitions in vector quantization and neural gas
Neurocomputing
Statistical Mechanics of On-line Learning
Similarity-Based Clustering
Window-based example selection in learning vector quantization
Neural Computation
Hybrid model of clustering and kernel autoassociator for reliable vehicle type classification
Machine Vision and Applications
Various alternatives have been developed to improve on the winner-takes-all (WTA) mechanism in vector quantization, including the neural gas (NG). However, the behavior of these algorithms, including their learning dynamics, robustness with respect to initialization, and asymptotic properties, has been studied only partially in a rigorous mathematical framework. The theory of on-line learning allows for an exact mathematical description of the training dynamics in model situations. Using a system of three competing prototypes trained on data from a mixture of Gaussian clusters, we demonstrate that the NG can improve convergence speed and achieve robustness to initial conditions. However, depending on the structure of the data, the NG does not always obtain the best asymptotic quantization error.
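To illustrate the mechanism the abstract contrasts with WTA: in neural gas, every prototype is updated toward each input, weighted by a rank-based factor that decays with the prototype's distance rank. The following is a minimal sketch, not the authors' analyzed model; the learning rate `eta`, the annealing schedule for the neighborhood range `lam`, and the toy two-cluster data are illustrative assumptions. As `lam` approaches zero, the update reduces to WTA vector quantization.

```python
import numpy as np

def neural_gas_update(prototypes, x, eta=0.05, lam=1.0):
    """One online neural-gas step: each prototype moves toward the input x,
    weighted by exp(-rank / lam), where rank 0 is the closest prototype.
    For lam -> 0 only the winner moves (WTA vector quantization)."""
    dists = np.linalg.norm(prototypes - x, axis=1)
    ranks = np.argsort(np.argsort(dists))          # 0 = closest, 1 = next, ...
    weights = np.exp(-ranks / lam)
    return prototypes + eta * weights[:, None] * (x - prototypes)

# Toy setup loosely mirroring the model situation in the abstract:
# three prototypes, data drawn from a mixture of two Gaussian clusters.
rng = np.random.default_rng(0)
protos = rng.normal(size=(3, 2))
centers = np.array([[2.0, 0.0], [-2.0, 0.0]])
for t in range(1000):
    x = centers[rng.integers(2)] + rng.normal(scale=0.5, size=2)
    # anneal lam so the dynamics cross over from NG toward WTA
    protos = neural_gas_update(protos, x, eta=0.05,
                               lam=max(1.0 * np.exp(-t / 300), 0.01))
```

Annealing `lam` is what gives NG its robustness to initialization: early on, all prototypes are dragged into the populated region of the data, so no prototype can remain stranded as it might under pure WTA updates.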