Neuron-Less Neural-Like Networks with Exponential Association Capacity at Tabula Rasa

  • Authors:
  • Demian Battaglia

  • Affiliations:
  • Department of Nonlinear Dynamics, Max Planck Institute for Dynamics and Self-Organization, D-37073 Göttingen, Germany, and Bernstein Center for Computational Neuroscience, Göttingen, Germany

  • Venue:
  • IWINAC '09 Proceedings of the 3rd International Work-Conference on The Interplay Between Natural and Artificial Computation: Part I: Methods and Models in Artificial and Natural Computation. A Homage to Professor Mira's Scientific Legacy
  • Year:
  • 2009

Abstract

Artificial neural networks have been used as models of associative memory, but their storage capacity is severely limited. Alternative machine-learning approaches perform better in classification tasks but require long learning sessions to build an optimized representational space. Here we present a radically new approach to the problem of classification, based on the fact that networks associated with random hard constraint satisfaction problems naturally display an exponentially large number of attractor clusters. We introduce a warning propagation dynamics that allows selective mapping of arbitrary input vectors onto these well-separated clusters of states, without the need for training. The potential for such networks with exponential capacity to handle inputs with a combinatorially complex structure is finally explored with a toy example.
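
The abstract names warning propagation as the dynamics used to map inputs onto attractor clusters but gives no equations. The following is a minimal sketch of standard warning propagation on a random 3-SAT instance; the choice of 3-SAT as the constraint satisfaction problem, the sign convention for the couplings J, the synchronous update schedule, and the function names (random_3sat, warning_propagation) are illustrative assumptions, not the authors' exact construction or input-mapping rule.

```python
# Minimal sketch of warning propagation (WP) on a random 3-SAT instance.
# Assumptions (not from the paper): 3-SAT as the constraint satisfaction
# problem, the sign convention for J, and a synchronous, damping-free update.
import random


def random_3sat(n_vars, n_clauses, seed=0):
    """Build a random 3-SAT formula as a list of clauses.

    Each clause is a list of (variable index, J) pairs, with the convention
    J = -1 for a positive literal (clause satisfied by x_i = 1) and
    J = +1 for a negated literal (clause satisfied by x_i = 0).
    """
    rng = random.Random(seed)
    clauses = []
    for _ in range(n_clauses):
        variables = rng.sample(range(n_vars), 3)
        clauses.append([(i, rng.choice((-1, +1))) for i in variables])
    return clauses


def warning_propagation(n_vars, clauses, max_iters=200, seed=0):
    """Iterate clause-to-variable warnings u[(a, i)] in {0, 1} to a fixed point.

    Clause a sends a warning to variable i when every other variable j in a
    is pushed, by the warnings it receives from its other clauses, toward the
    assignment that violates a.
    """
    rng = random.Random(seed)
    # membership[j] = list of (clause index, J) in which variable j appears
    membership = [[] for _ in range(n_vars)]
    for a, clause in enumerate(clauses):
        for i, J in clause:
            membership[i].append((a, J))
    # random initial warnings
    u = {(a, i): rng.randint(0, 1)
         for a, clause in enumerate(clauses) for i, _ in clause}

    for _ in range(max_iters):
        new_u = {}
        for a, clause in enumerate(clauses):
            for i, _ in clause:
                warn = 1
                for j, J_aj in clause:
                    if j == i:
                        continue
                    # Cavity field on j, excluding clause a: positive h pushes
                    # x_j toward 1, negative toward 0, zero means unconstrained.
                    h = -sum(J_bj * u[(b, j)]
                             for b, J_bj in membership[j] if b != a)
                    # j blocks the warning unless it is pushed toward the
                    # value that violates clause a (h * J_aj > 0).
                    if not h * J_aj > 0:
                        warn = 0
                        break
                new_u[(a, i)] = warn
        if new_u == u:       # fixed point reached
            return u, True
        u = new_u
    return u, False          # no convergence within max_iters


if __name__ == "__main__":
    n_vars, alpha = 200, 3.5          # clause density below the SAT threshold
    formula = random_3sat(n_vars, int(alpha * n_vars))
    warnings_, converged = warning_propagation(n_vars, formula)
    active = sum(warnings_.values())
    print(f"converged={converged}, active warnings={active}/{len(warnings_)}")
```

At a fixed point of such dynamics, the pattern of nonzero warnings characterizes one cluster of compatible assignments; the abstract's claim is that hard random instances supply exponentially many well-separated clusters of this kind with no training, which is what the sketch above is meant to make concrete.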