Unsupervised learning of relations

  • Authors:
  • Matthew Cook; Florian Jug; Christoph Krautz; Angelika Steger

  • Affiliations:
  • Institute of Neuroinformatics, University of Zurich and ETH Zurich, Switzerland; Institute of Theoretical Computer Science, ETH Zurich, Switzerland; Institute of Theoretical Computer Science, ETH Zurich, Switzerland; Institute of Theoretical Computer Science, ETH Zurich, Switzerland

  • Venue:
  • ICANN'10: Proceedings of the 20th International Conference on Artificial Neural Networks, Part I
  • Year:
  • 2010

Abstract

Learning processes allow the central nervous system to acquire relationships between stimuli. Even stimuli from different modalities can easily be associated, and these associations can include the learning of mappings between observable parameters of the stimuli. The data structures and processing methods of the brain, however, remain very poorly understood. We investigate the ability of simple, biologically plausible processing mechanisms to learn such relationships when the data are represented using population codes, a coding scheme that has been found in a variety of cortical areas. We require that the relationships be learned not just from the point of view of an omniscient observer; rather, the network itself must be able to make effective use of the learned relationship within the population code representations. Using a form of Hebbian learning, local winner-take-all, and homeostatic activity regulation away from the periphery, we obtain a learning framework that is able to learn relationships from examples and then use the learned relationships for a variety of routine nervous system tasks such as inference, de-noising, cue integration, and decision making.
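
The abstract names the ingredients of the framework but not an implementation. The sketch below is an illustrative assumption, not the authors' model: it combines the three named mechanisms on population-coded data, with two variables and their sum each encoded as a bump of activity, a hidden layer competing via a (here global, rather than local) winner-take-all, a Hebbian-style update of the winning unit, and a homeostatic threshold that keeps all hidden units in use. The specific relation z = (x + y) mod 1, the Gaussian circular population code, the layer sizes, learning rates, and the exact update rules are all choices made for this sketch.

```python
# Minimal sketch (assumptions only) of learning a relation between
# population-coded variables with Hebbian updates, winner-take-all,
# and homeostatic regulation.
import numpy as np

rng = np.random.default_rng(0)

N = 40          # units per population code (assumed)
HIDDEN = 200    # units in the hidden "relation" layer (assumed)
SIGMA = 2.0     # tuning width of the population code (assumed)

def population_code(value, n=N, sigma=SIGMA):
    """Encode a scalar in [0, 1) as a bump of activity over n tuned units."""
    centers = np.arange(n)
    d = np.abs(centers - value * n)
    d = np.minimum(d, n - d)              # wrap-around (circular) code
    act = np.exp(-0.5 * (d / sigma) ** 2)
    return act / act.sum()

def decode(act, n=N):
    """Decode a population vector back to a scalar via the circular mean."""
    angles = 2 * np.pi * np.arange(n) / n
    return (np.angle(np.sum(act * np.exp(1j * angles))) % (2 * np.pi)) / (2 * np.pi)

# Weights from the three concatenated populations (x, y, z) to the hidden layer.
W = rng.random((HIDDEN, 3 * N)) * 0.1
threshold = np.zeros(HIDDEN)              # homeostatic excitability offsets
eta, eta_h = 0.05, 0.01                   # Hebbian / homeostatic rates (assumed)

for step in range(20000):
    x, y = rng.random(), rng.random()
    z = (x + y) % 1.0                     # the relation to be learned (assumed)
    inp = np.concatenate([population_code(x), population_code(y), population_code(z)])

    # Winner-take-all: only the best-matching hidden unit is active.
    drive = W @ inp - threshold
    winner = np.argmax(drive)

    # Hebbian-style competitive update: move the winner's weights toward the input.
    W[winner] += eta * (inp - W[winner])

    # Homeostasis: frequent winners become slightly less excitable, others
    # slightly more, so activity spreads over the whole hidden layer.
    threshold *= (1.0 - eta_h)
    threshold[winner] += eta_h

# Using the learned relation for inference: observe x and y, leave z blank,
# pick the best-matching hidden prototype, and read off its learned z-part.
x, y = 0.3, 0.5
partial = np.concatenate([population_code(x), population_code(y), np.zeros(N)])
winner = np.argmax(W @ partial - threshold)
z_est = decode(W[winner, 2 * N:])
print(f"inferred z ~ {z_est:.2f} (target {(x + y) % 1.0:.2f})")
```

In this toy setting the same learned weights support several of the "routine tasks" mentioned in the abstract: completing a missing variable from the other two is a simple form of inference, matching a corrupted input against the learned prototypes acts as de-noising, and combining two noisy cues for the same variable before matching amounts to a crude cue integration. How closely any of this corresponds to the mechanisms studied in the paper is, of course, for the paper itself to say.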