Information theory and neural information processing

  • Authors:
  • Don H. Johnson

  • Affiliations:
  • Department of Electrical and Computer Engineering, Rice University, Houston, TX

  • Venue:
  • IEEE Transactions on Information Theory - Special issue on information theory in molecular biology and neuroscience
  • Year:
  • 2010

Abstract

Neuroscientists want to quantify how well neurons, individually and collectively, process information and encode the result in their outputs. We demonstrate that while classic information theory demarcates optimal performance boundaries, it does not provide results that would be useful in analyzing an existing system about which little is known (such as the brain). In the classical vein, non-Poisson channels, which describe the communication medium for neural signals, are each shown to have a capacity strictly smaller than that of the ideal Poisson channel. We describe recent capacity results for Poisson neural populations, showing that connections among neurons can increase capacity. We then present an alternative theory more amenable to data analysis and to situations wherein systems actively extract and represent information. Using this theory, we show that the ability of a neural population to jointly represent information depends on the nature of its input signal, not on the encoded information.
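For context, the "Poisson ideal" benchmark mentioned in the abstract is, we assume, the classical single-neuron Poisson channel with a peak intensity constraint and no dark current. A minimal sketch of that well-known capacity result (commonly attributed to Kabanov and Davis), assuming the instantaneous firing rate is constrained by \(0 \le \lambda(t) \le \Lambda_{\max}\):

\[
  C \;=\; \max_{0 \le \lambda(t) \le \Lambda_{\max}} I\bigl(\lambda;\, N\bigr) \;=\; \frac{\Lambda_{\max}}{e} \quad \text{nats per second},
\]

where \(N\) denotes the observed point-process output. Under this assumption, a neuron firing at a peak rate of, say, \(\Lambda_{\max} = 100\) spikes/s could convey at most roughly \(100/e \approx 37\) nats/s; non-Poisson (e.g., refractory) channels fall below this bound, which is the comparison the abstract refers to.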