Anthropic correction of information estimates and its application to neural coding

  • Authors:
  • Michael C. Gastpar; Patrick R. Gill; Alexander G. Huth; Frédéric E. Theunissen

  • Affiliations:
  • Michael C. Gastpar: Department of Electrical Engineering and Computer Sciences, University of California, Berkeley, Berkeley, CA, and Faculty of Electrical Engineering, Mathematics and Computer Science, Delft University of Technology, Delft, The Netherlands
  • Patrick R. Gill: School of Electrical and Computer Engineering, Cornell University, Ithaca, NY
  • Alexander G. Huth: Helen Wills Neuroscience Institute, University of California, Berkeley, CA
  • Frédéric E. Theunissen: Helen Wills Neuroscience Institute and Department of Psychology, University of California, Berkeley, CA

  • Venue:
  • IEEE Transactions on Information Theory - Special issue on information theory in molecular biology and neuroscience
  • Year:
  • 2010

Abstract

Information theory has been used as an organizing principle in neuroscience for several decades. Estimates of the mutual information (MI) between signals acquired in neurophysiological experiments are believed to yield insights into the structure of the underlying information-processing architectures. With the pervasive availability of recordings from many neurons, several information and redundancy measures have been proposed in the recent literature. A typical scenario is that only a small number of stimuli can be tested, while ample response data may be available for each of the tested stimuli. The resulting asymmetric information estimation problem is considered. It is shown that the direct plug-in information estimate has a negative bias. An anthropic correction is introduced that has a positive bias. These two complementary estimators and their combinations are natural candidates for information estimation in neuroscience. Tail and variance bounds are given for both estimates. The proposed information estimates are applied to the analysis of neural discrimination and redundancy in the avian auditory system.
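
To make the estimation setting concrete, the following is a minimal NumPy sketch of the direct plug-in estimator referred to in the abstract, computed as I = H(R) - H(R|S) in the asymmetric regime (few tested stimuli, many response samples per stimulus). The function names and the uniform prior over the tested stimuli are illustrative assumptions; the paper's anthropic correction itself is not reproduced here.

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy in bits of a probability vector; zero entries are ignored."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def plugin_mi(resp_counts):
    """Direct plug-in estimate I = H(R) - H(R|S), in bits.

    resp_counts: (m, K) array of response histograms, one row per tested
    stimulus -- the asymmetric regime of the abstract: few stimuli m, ample
    response samples per stimulus. A uniform prior over the m tested stimuli
    is assumed here (an illustrative choice, not taken from the paper).
    """
    p_r_given_s = resp_counts / resp_counts.sum(axis=1, keepdims=True)
    p_r = p_r_given_s.mean(axis=0)  # marginal response distribution under the uniform prior
    h_r = entropy_bits(p_r)
    h_r_given_s = np.mean([entropy_bits(row) for row in p_r_given_s])
    return h_r - h_r_given_s

# Toy demonstration: two tested stimuli, 10 response bins, 10,000 trials each.
rng = np.random.default_rng(0)
counts = np.vstack([rng.multinomial(10_000, rng.dirichlet(np.ones(10)))
                    for _ in range(2)])
print(f"plug-in MI estimate: {plugin_mi(counts):.3f} bits")
```

Because only the tested stimuli enter the marginal H(R), this estimate understates the information carried about the full stimulus ensemble, which is the negative bias the abstract refers to and the motivation for the complementary, positively biased anthropic correction.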