Homeostatic synaptic scaling in self-organizing maps
Neural Networks, 2006 Special Issue: Advances in Self-Organizing Maps (WSOM'05)
Hebbian learning has been a staple of neural-network models for many years. It is well known that the most straightforward implementations of this popular learning rule lead to unconstrained weight growth. A newly discovered property of cortical neurons is that they try to maintain a preset average firing rate [G.G. Turrigiano, S.B. Nelson, Homeostatic plasticity in the developing nervous system, Nat. Rev. Neurosci. 5 (2004) 97-107]. We use this property to control the Hebbian learning process in a self-organizing map network. In this article, we extend the practicality of this type of learning rule by deriving a scaling equation for the learning rates of various network architectures.
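The interplay the abstract describes can be illustrated with a minimal sketch: plain Hebbian updates grow the weights without bound, while a homeostatic term multiplicatively rescales each unit's incoming weights so that its running average firing rate drifts toward a preset target. All names and parameter values below (`eta`, `target_rate`, `tau`, the moving-average constant) are illustrative assumptions, not the equations derived in the article.

```python
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_units = 10, 4
W = rng.random((n_units, n_inputs)) * 0.1  # initial weight matrix (assumed)
eta = 0.01          # Hebbian learning rate (illustrative value)
target_rate = 0.5   # preset average firing rate each unit tries to maintain
tau = 0.1           # strength of the homeostatic scaling step (assumed)
avg_rate = np.full(n_units, target_rate)  # running estimate of firing rates

for step in range(1000):
    x = rng.random(n_inputs)   # random input pattern
    y = W @ x                  # linear unit activations

    # Plain Hebbian update: on its own this grows W without bound
    W += eta * np.outer(y, x)

    # Exponential moving average of each unit's activity
    avg_rate = 0.9 * avg_rate + 0.1 * y

    # Homeostatic synaptic scaling: multiplicatively scale all of a
    # unit's incoming weights toward the preset target firing rate
    scale = (target_rate / np.maximum(avg_rate, 1e-6)) ** tau
    W *= scale[:, None]

print(np.all(np.isfinite(W)), float(W.max()))
```

Because the scaling factor shrinks a unit's weights whenever its average activity exceeds the target (and grows them when activity falls short), the Hebbian growth and the homeostatic correction settle into a bounded equilibrium rather than diverging.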