A self-organizing map with homeostatic synaptic scaling

  • Authors:
  • Thomas J. Sullivan; Virginia R. de Sa

  • Affiliations:
  • Department of Electrical and Computer Engineering, University of California, San Diego, USA; Department of Cognitive Science, University of California, San Diego, USA

  • Venue:
  • Neurocomputing
  • Year:
  • 2006

Abstract

Hebbian learning has been a staple of neural-network models for many years. It is well known that the most straightforward implementations of this popular learning rule lead to unconstrained weight growth. A newly discovered property of cortical neurons is that they try to maintain a preset average firing rate [G.G. Turrigiano, S.B. Nelson, Homeostatic plasticity in the developing nervous system, Nat. Rev. Neurosci. 5 (2004) 97-107]. We use this property to control the Hebbian learning process in a self-organizing map network. In this article, we extend the practicality of this type of learning rule by deriving a scaling equation for the learning rates of various network architectures.
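
The sketch below illustrates the general idea described in the abstract: a plain Hebbian update in a self-organizing map lets weights grow without bound, while a homeostatic term multiplicatively rescales each unit's incoming weights to hold its running-average activity near a preset target. This is not the authors' implementation; the map size, activation and neighborhood functions, and all parameter values are assumptions chosen for illustration, and the paper's scaling equation for the learning rates is not reproduced here.

```python
# Minimal sketch (assumptions throughout): Hebbian learning in a SOM with
# homeostatic synaptic scaling driving each unit toward a target firing rate.
import numpy as np

rng = np.random.default_rng(0)

n_units = 10 * 10          # 10x10 map (assumed size)
n_inputs = 16              # input dimensionality (assumed)
eta = 0.05                 # Hebbian learning rate (assumed)
tau = 0.01                 # homeostatic scaling rate (assumed)
target_rate = 0.1          # preset average firing rate each unit tries to keep

W = rng.random((n_units, n_inputs))       # feedforward weights
avg_rate = np.full(n_units, target_rate)  # running estimate of each unit's activity

# map coordinates for the neighborhood function
coords = np.array([(i, j) for i in range(10) for j in range(10)], dtype=float)

def neighborhood(winner, sigma=1.5):
    """Gaussian neighborhood around the winning unit (standard SOM ingredient)."""
    d2 = np.sum((coords - coords[winner]) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

for step in range(5000):
    x = rng.random(n_inputs)              # stand-in for a training input

    # activity: dot-product response gated by the SOM neighborhood of the winner
    response = W @ x
    winner = int(np.argmax(response))
    y = neighborhood(winner) * (response / (np.linalg.norm(x) + 1e-12))

    # plain Hebbian update -- on its own this lets the weights grow without bound
    W += eta * np.outer(y, x)

    # homeostatic synaptic scaling: track each unit's average activity and
    # multiplicatively scale all of its incoming weights toward the target rate
    avg_rate = 0.99 * avg_rate + 0.01 * y
    scale = 1.0 + tau * (target_rate - avg_rate)
    W *= scale[:, np.newaxis]
```

Because the homeostatic term rescales all of a unit's incoming weights by a common factor, it limits overall weight growth without clipping individual synapses, which is the qualitative behavior the abstract attributes to homeostatic synaptic scaling.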