An upper bound on the informational capacity of a synapse

  • Authors:
  • Warren S. McCulloch

  • Affiliations:
  • University of Illinois Neuropsychiatric Institute

  • Venue:
  • ACM '52 Proceedings of the 1952 ACM national meeting (Pittsburgh)
  • Year:
  • 1952

Abstract

The paper I want to read to you is concerned with the relative efficiency of two modes of transmitting information through such computing machines as you have in your heads. Since you are not neurophysiologists, I shall first describe the essential components of brains. These are called neurons. Each is a living creature and consequently has many properties which are irrelevant to our present discussion. Imagine a neuron as shaped something like a turnip, having fronds called dendrites, a body or soma, and a long thin taproot -- its axone. The biggest of these neurons has fronds say a centimeter long, a body one tenth of a millimeter in diameter, and a taproot a few micra thick and anything up to two meters long. Neurons are generally connected by branches of the taproot, ending on or among the dendrites or as knobs on the soma of the next neuron. The conduction of signals through this system is normally down the axone of one cell to the dendrites or cell body of the next, and the effective junction which makes this possible is called a synapse. We shall eventually be considering the rate at which information can be transmitted through a synapse. For this it is necessary for you to know something of the nature of the signals.
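The "two modes of transmitting information" the abstract mentions can be given a rough quantitative flavor. The sketch below is an illustration under assumed modern information-theoretic terms, not McCulloch's own derivation: it contrasts an upper bound for a purely all-or-none (binary) pulse code, which carries at most one bit per possible pulse slot, with a code in which the *timing* of each pulse is resolvable to some precision, so each pulse can carry up to log2(interval / precision) bits. All function names and the numeric values are hypothetical.

```python
import math

def binary_upper_bound(max_rate_hz: float) -> float:
    """Upper bound in bits/s for an all-or-none code: at most one bit
    per possible pulse slot (illustrative assumption)."""
    return max_rate_hz * 1.0

def timing_upper_bound(mean_rate_hz: float, timing_precision_s: float) -> float:
    """Upper bound in bits/s if pulse timing within a mean interpulse
    interval is resolvable to timing_precision_s: each pulse can then
    distinguish (interval / precision) cases, i.e. log2 of that many bits."""
    interval_s = 1.0 / mean_rate_hz
    bits_per_pulse = math.log2(interval_s / timing_precision_s)
    return mean_rate_hz * bits_per_pulse

# Hypothetical numbers, chosen only to show the comparison:
# a 1 kHz binary channel vs. 100 pulses/s timed to 0.1 ms.
print(binary_upper_bound(1000.0))       # 1000 bits/s
print(timing_upper_bound(100.0, 1e-4))  # 100 * log2(100) ≈ 664 bits/s
```

The point of the comparison is only that the two coding modes scale differently: the binary bound grows linearly with pulse rate, while the timing bound trades rate against temporal precision logarithmically.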