We consider different ways to control the magnification in self-organizing maps (SOM) and neural gas (NG). Starting from early approaches to magnification control in vector quantization, we then concentrate on approaches specific to SOM and NG. We show that three structurally similar approaches can be applied to both algorithms: localized learning, concave-convex learning, and winner-relaxing learning. In doing so, we extend concave-convex learning for SOM to a more general description, while concave-convex learning for NG is new. Overall, the control mechanisms produce only slightly different behavior in the two neural algorithms. However, we emphasize that the NG results are valid for any data dimension, whereas the SOM results hold only in the one-dimensional case.
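To make the setting concrete, the sketch below shows a minimal neural gas quantizer with an illustrative magnification-control exponent `kappa` modulating the update magnitude (with `kappa=1.0` the standard NG rule is recovered). The function name, parameters, and the exact form of the modulation are assumptions for illustration only, not the paper's formulation.

```python
import numpy as np

def neural_gas(data, n_units=10, n_steps=5000, eps=0.1, lam=2.0,
               kappa=1.0, seed=0):
    """Minimal neural gas sketch with an illustrative magnification
    exponent `kappa` (assumption for illustration; kappa=1.0 gives
    the standard rank-based NG update)."""
    rng = np.random.default_rng(seed)
    # initialize prototypes on random data points
    w = rng.choice(data, size=n_units, replace=False).astype(float)
    for t in range(n_steps):
        x = data[rng.integers(len(data))]
        # rank units by distance to the sample (NG neighborhood ranking)
        dists = np.linalg.norm(w - x, axis=1)
        ranks = np.argsort(np.argsort(dists))
        h = np.exp(-ranks / lam)          # rank-based neighborhood function
        delta = x - w
        # illustrative concave-convex-style modulation of the step size
        mag = np.linalg.norm(delta, axis=1, keepdims=True)
        mod = np.where(mag > 0, mag ** (kappa - 1.0), 0.0)
        w += eps * h[:, None] * delta * mod
    return w
```

With `kappa=1.0` every update is a convex combination of the old prototype and the sample, so prototypes stay within the convex hull of the data; varying `kappa` rescales the effective step size with the local quantization error, which is the kind of knob the magnification-control schemes discussed above turn.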