A novel stochastic learning rule for neural networks

  • Authors:
  • Frank Emmert-Streib

  • Affiliations:
  • Institut für Theoretische Physik, Universität Bremen, Bremen, Germany

  • Venue:
  • ISNN'06: Proceedings of the Third International Conference on Advances in Neural Networks - Volume Part I
  • Year:
  • 2006

Abstract

The purpose of this article is to introduce a novel stochastic Hebb-like learning rule for neural networks that combines features of unsupervised (Hebbian) and supervised (reinforcement) learning. The learning rule is stochastic with respect to the selection of the time points at which a synaptic modification is induced by simultaneous activation of the pre- and postsynaptic neuron. Moreover, the learning rule does not only affect the synapse between the pre- and postsynaptic neuron, which is called homosynaptic plasticity, but also affects further, more remote synapses of the pre- and postsynaptic neurons. This more complex form of plasticity, called heterosynaptic plasticity, has recently attracted interest in experimental neurobiology. Our learning rule is motivated by these experimental findings and gives a qualitative explanation of this kind of synaptic plasticity. Additionally, we present numerical results demonstrating that the learning rule trains neural networks well, even in the presence of noise.
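
The abstract describes the rule only qualitatively, so the following is a minimal sketch of the general idea, not the paper's actual rule: co-activation of pre- and postsynaptic neurons triggers an update only at stochastically selected time points, the sign is modulated by a reinforcement signal, and synapses sharing a neuron with a modified synapse receive a smaller heterosynaptic change. The update probability, learning rates, reward encoding, and the choice of an oppositely signed heterosynaptic term are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_hebb_update(W, pre, post, reward,
                           p_update=0.1, eta=0.05, eta_hetero=0.01):
    """Illustrative stochastic Hebb-like step (hypothetical, not the paper's rule).

    W      : (n_post, n_pre) weight matrix
    pre    : binary activity vector of presynaptic neurons, shape (n_pre,)
    post   : binary activity vector of postsynaptic neurons, shape (n_post,)
    reward : scalar reinforcement signal, e.g. +1 or -1 (assumed encoding)
    """
    # Hebbian condition: simultaneous activation of pre- and postsynaptic neuron.
    coactive = np.outer(post, pre).astype(bool)

    # Stochastic selection of time points: each co-active synapse is only
    # modified with probability p_update.
    gate = rng.random(W.shape) < p_update
    homo = coactive & gate

    # Homosynaptic modification, sign set by the reinforcement signal.
    W[homo] += eta * reward

    # Heterosynaptic modification: synapses sharing the pre- or postsynaptic
    # neuron of a modified synapse receive a smaller, oppositely signed change
    # (an illustrative choice, not taken from the paper).
    rows, cols = np.nonzero(homo)
    hetero = np.zeros_like(W, dtype=bool)
    hetero[rows, :] = True   # other synapses of the same postsynaptic neurons
    hetero[:, cols] = True   # other synapses of the same presynaptic neurons
    hetero &= ~homo
    W[hetero] -= eta_hetero * reward
    return W

# Example usage with random binary activities and a positive reward.
W = rng.normal(scale=0.1, size=(4, 6))
pre = rng.integers(0, 2, size=6)
post = rng.integers(0, 2, size=4)
W = stochastic_hebb_update(W, pre, post, reward=+1)
```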