Neural networks, orientations of the hypercube, and algebraic threshold functions

  • Authors:
  • P. Baldi

  • Affiliations:
  • California Institute of Technology, Pasadena, CA

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 1988


Abstract

A class of possible generalizations of current neural network models is described using local improvement algorithms and orientations of graphs. A notion of dynamical capacity is defined and, by computing bounds on the number of algebraic threshold functions, it is proven that for neural networks of size n and energy function of degree d, this capacity is O(n^(d+1)). Stable states are studied, and it is shown that for the same networks the storage capacity is O(n^(d+1)). In the case of random orientations, it is proven that the expected number of stable states is exponential. Applications to coding theory are indicated, and it is shown that usual codes can be embedded in neural networks but only at high cost. Cycles and their storage are also examined.
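
The abstract's hypercube-orientation view can be made concrete with a small brute-force experiment. The sketch below is not code from the paper; it is a minimal illustration, assuming the standard degree d = 2 (Hopfield-style) energy E(x) = -1/2 x^T W x on states x in {-1, +1}^n, where each edge of the n-cube is oriented toward lower energy and stable states are the sinks. All function names and parameter choices (`energy`, `is_stable`, Gaussian symmetric weights, the sizes tried) are illustrative assumptions.

```python
# Illustrative sketch (not from the paper): states of a size-n network are
# vertices of the hypercube {-1,+1}^n; local improvement moves along edges
# toward lower energy; stable states are the sinks of that orientation.

import itertools
import numpy as np

def energy(W, x):
    """Degree-2 energy function E(x) = -1/2 x^T W x (symmetric W, zero diagonal)."""
    return -0.5 * x @ W @ x

def is_stable(W, x):
    """A vertex is stable (a sink) if no single-coordinate flip lowers the energy."""
    e = energy(W, x)
    for i in range(len(x)):
        y = x.copy()
        y[i] = -y[i]          # move to the neighboring hypercube vertex
        if energy(W, y) < e:  # an improving edge exists, so x is not a sink
            return False
    return True

def count_stable_states(W):
    """Exhaustively count sinks of the orientation W induces on the n-cube."""
    n = W.shape[0]
    return sum(
        is_stable(W, np.array(bits))
        for bits in itertools.product((-1, 1), repeat=n)
    )

def random_symmetric(n, rng):
    """Random symmetric couplings with zero diagonal: a 'random' orientation."""
    A = rng.standard_normal((n, n))
    W = (A + A.T) / 2
    np.fill_diagonal(W, 0.0)
    return W

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    for n in (4, 6, 8, 10):
        counts = [count_stable_states(random_symmetric(n, rng)) for _ in range(20)]
        print(f"n={n:2d}  mean number of stable states over 20 draws: {np.mean(counts):.1f}")
```

For small n the average sink count grows with n, loosely echoing the paper's result that the expected number of stable states under random orientations is exponential. The same scheme extends to energy functions of degree d by summing couplings over d-tuples of coordinates, which is the setting of the O(n^(d+1)) capacity bounds.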