Self-Organizing Maps
We propose a simple model of topographic-map formation between cell layers with weight normalization. In our model, each cell layer may have an arbitrary neighborhood relation among its cells, represented by an undirected graph; a topographic mapping in this model is therefore a map that preserves this adjacency relation. We define several learning rules together with input-type and output-type weight normalization methods. We then consider not only Hebbian weight modification but also the effects of normalization under non-Hebbian weight modification. We first show that with input-type normalization, or with no normalization at all, a topographic mapping is stable under the correlational learning rule, whereas with output-type normalization a topographic mapping is stable under both the correlational and the non-correlational learning rules. Next, we show by computer simulations that output-type normalization admits more learning rules that yield topographic mappings than either input-type normalization or no normalization.
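The distinction between input-type and output-type normalization can be sketched as follows. This is an illustrative reading, not the paper's exact formulation: the layer sizes, learning rate, and the choice of a sum-to-one normalization are assumptions; a correlational (Hebbian) update is combined with normalization applied either over each input cell's outgoing weights or over each output cell's incoming weights.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 8, 8                      # assumed layer sizes
W = rng.random((n_out, n_in))           # weights from input layer to output layer

def hebbian_step(W, x, y, eta=0.1):
    # Correlational (Hebbian) modification: dW[i, j] = eta * y[i] * x[j]
    return W + eta * np.outer(y, x)

def normalize_input_type(W):
    # Input-type normalization: the weights leaving each input cell
    # (one column of W) are rescaled to sum to 1.
    return W / W.sum(axis=0, keepdims=True)

def normalize_output_type(W):
    # Output-type normalization: the weights arriving at each output cell
    # (one row of W) are rescaled to sum to 1.
    return W / W.sum(axis=1, keepdims=True)

x = rng.random(n_in)                    # sample input activity
y = W @ x                               # linear output activity

W_in = normalize_input_type(hebbian_step(W, x, y))
W_out = normalize_output_type(hebbian_step(W, x, y))
```

Under input-type normalization each input cell distributes a fixed total weight over the output layer, while under output-type normalization each output cell's total incoming weight is held fixed; the abstract's result is that the latter constraint stabilizes topographic mappings under a wider class of learning rules.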