The effect of different kinds of weight normalization on the outcome of a simple competitive learning rule is analyzed. It is shown that there are important differences in the representation formed depending on whether the constraint is enforced by dividing each weight by the same amount ("divisive enforcement") or subtracting a fixed amount from each weight ("subtractive enforcement"). For the divisive cases, weight vectors spread out over the space so as to evenly represent "typical" inputs, whereas for the subtractive cases the weight vectors tend to the axes of the space, so as to represent "extreme" inputs. The consequences of these differences are examined.
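The divisive/subtractive distinction can be illustrated with a toy simulation. This is a minimal sketch, not the model analyzed in the text: a single Hebbian unit, uniform random inputs, a sum-to-one weight constraint, and the learning rate and step count are all assumptions chosen for illustration. Under divisive enforcement the weights stay spread over the input components; under subtractive enforcement the weight vector is driven toward an axis of the space.

```python
import numpy as np

rng = np.random.default_rng(0)

def project_simplex(v):
    """Subtractive enforcement: subtract one common threshold from every
    weight (clipping at zero) so that the weights again sum to 1."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    idx = np.arange(1, len(v) + 1)
    rho = np.nonzero(u * idx > css - 1.0)[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1)
    return np.clip(v - theta, 0.0, None)

def train(enforcement, steps=4000, lr=0.05, dim=4):
    w = np.full(dim, 1.0 / dim)        # start at the centre of the simplex
    for _ in range(steps):
        x = rng.random(dim)            # non-negative random input
        y = w @ x                      # unit activation
        w = w + lr * y * x             # plain Hebbian update
        if enforcement == "divisive":
            w = w / w.sum()            # same multiplicative factor for all weights
        else:
            w = project_simplex(w)     # same subtractive offset for all weights
    return w

w_div = train("divisive")
w_sub = train("subtractive")
print("divisive:   ", np.round(w_div, 3))  # weights remain spread over components
print("subtractive:", np.round(w_sub, 3))  # weight mass heads toward one axis
```

With these assumed inputs the divisive run settles near the uniform ("typical") weight vector, while the subtractive run, once a fluctuation breaks the symmetry, is amplified until almost all weight sits on a single ("extreme") component.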