Emerging Bayesian priors in a self-organizing recurrent network

  • Authors:
  • Andreea Lazar, Gordon Pipa, Jochen Triesch

  • Affiliations:
  • Max Planck Institute for Brain Research, Frankfurt am Main, Germany
  • Institute of Cognitive Science, University of Osnabrück, Osnabrück, Germany
  • Frankfurt Institute for Advanced Studies, Frankfurt am Main, Germany

  • Venue:
  • ICANN'11: Proceedings of the 21st International Conference on Artificial Neural Networks, Part II
  • Year:
  • 2011


Abstract

We explore the role of local plasticity rules in learning statistical priors in a self-organizing recurrent neural network (SORN). The network receives input sequences composed of different symbols and learns the structure embedded in these sequences via a simple spike-timing-dependent plasticity rule, while synaptic normalization and intrinsic plasticity maintain a low level of activity. After learning, the network exhibits spontaneous activity that matches the stimulus-evoked activity during training and can thus be interpreted as sampling from the network's prior probability distribution over evoked activity states. Further, we show how learning the frequency and spatio-temporal characteristics of the input sequences influences network performance in several classification tasks. These results suggest a novel connection between low-level learning mechanisms and high-level concepts of statistical inference.
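The abstract names three local rules acting together: spike-timing-dependent plasticity on the recurrent excitatory weights, synaptic normalization of each unit's incoming weights, and intrinsic plasticity of the firing thresholds. The following is a minimal sketch of how these three rules might interact in a binary-unit recurrent network; the network size, learning rates, and target rate are illustrative assumptions, not values from the paper, and external input and inhibition are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

N_E = 20          # number of excitatory units (illustrative)
eta_stdp = 0.004  # STDP learning rate (assumed)
eta_ip = 0.01     # intrinsic-plasticity learning rate (assumed)
h_target = 0.1    # target firing rate for intrinsic plasticity (assumed)

# sparse random recurrent weights among excitatory units, no self-connections
W = rng.random((N_E, N_E)) * (rng.random((N_E, N_E)) < 0.1)
np.fill_diagonal(W, 0.0)
T = rng.random(N_E) * 0.5                        # per-unit thresholds
x = (rng.random(N_E) < h_target).astype(float)   # current binary state

for step in range(200):
    x_prev = x
    # binary threshold update of the recurrent network
    x = (W @ x_prev - T > 0).astype(float)

    # STDP: potentiate pre-before-post pairs, depress post-before-pre pairs,
    # changing only synapses that already exist
    dW = eta_stdp * (np.outer(x, x_prev) - np.outer(x_prev, x))
    W = np.clip(W + dW * (W > 0), 0.0, 1.0)

    # synaptic normalization: each unit's incoming weights sum to one
    row_sums = W.sum(axis=1, keepdims=True)
    W = W / np.where(row_sums > 0, row_sums, 1.0)

    # intrinsic plasticity: nudge each threshold toward the target rate
    T += eta_ip * (x - h_target)
```

Normalization keeps the total incoming drive of every unit fixed, so STDP can only redistribute weight among a unit's inputs, while intrinsic plasticity raises the threshold of units firing above `h_target` and lowers it otherwise, jointly holding activity at a low, stable level.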