Population coding is the quantitative study of which algorithms or representations the brain uses to combine and evaluate the messages carried by different neurons. Here, we review an information-theoretic approach to population coding. We first discuss how to compute the information carried by simultaneously recorded neural populations, and in particular how to reduce the limited-sampling bias that affects the calculation of information from a finite amount of experimental data. We then discuss how to quantify the contribution of individual members of the population, or of the interactions between them, to the overall information encoded by the considered group of neurons. We focus in particular on evaluating the contribution of interactions up to any given order to the total information. We illustrate this formalism with applications to simulated data with realistic neuronal statistics and to real simultaneous recordings of multiple spike trains.
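To make the estimation problem concrete, the sketch below shows a plug-in (maximum-likelihood) mutual-information estimate between stimulus and discretized response, together with a first-order analytical bias correction of the Miller-Madow type. This is an illustrative stand-in for the bias-reduction procedures discussed in the review, not the authors' specific method; all function names are hypothetical.

```python
import math
from collections import Counter

def plugin_entropy(samples):
    """Plug-in (maximum-likelihood) entropy estimate, in bits."""
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def miller_madow_entropy(samples):
    """Plug-in entropy plus the first-order (Miller-Madow) bias correction.

    The plug-in estimator underestimates entropy by roughly
    (K - 1) / (2 N ln 2) bits, where K is the number of occupied
    response bins and N the number of samples.
    """
    n = len(samples)
    k = len(set(samples))
    return plugin_entropy(samples) + (k - 1) / (2 * n * math.log(2))

def mutual_information(stimuli, responses, entropy=plugin_entropy):
    """I(S; R) = H(R) - sum_s p(s) H(R | s), estimated from paired trials."""
    n = len(stimuli)
    h_r = entropy(responses)
    h_r_given_s = 0.0
    for s in set(stimuli):
        rs = [r for st, r in zip(stimuli, responses) if st == s]
        h_r_given_s += (len(rs) / n) * entropy(rs)
    return h_r - h_r_given_s
```

For a perfectly reliable binary code (response equals stimulus on every trial), the estimate recovers 1 bit; with few trials and many response bins, the plug-in estimate is biased upward in I(S;R) because the conditional entropies are each biased downward, which is what the correction term counteracts.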