An essential step towards understanding how the brain orchestrates information processing at the cellular and population levels is to simultaneously observe the spiking activity of the cortical neurons that mediate perception, learning, and motor processing. In this paper, we formulate an information-theoretic approach to determine whether cooperation among neurons may constitute a governing mechanism of information processing when encoding external covariates. Specifically, we show that conditional independence between neuronal outputs may not provide an optimal encoding strategy when the firing probability of a neuron depends on the history of firing of other neurons connected to it. Rather, cooperation among neurons can provide a "message-passing" mechanism that preserves most of the information in the covariates under specific constraints governing their connectivity structure. Using a biologically plausible statistical learning model, we demonstrate the performance of the proposed approach in synergistically encoding a motor task using a subset of neurons drawn randomly from a large population. We demonstrate its superiority, compared to a statistically independent model and a pairwise maximum entropy (MaxEnt) model, in approximating the joint density of the population from limited data.
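The comparison in the abstract between a statistically independent model and a pairwise MaxEnt (Ising-type) model can be illustrated with a minimal sketch. The following is not the paper's model: it builds a small synthetic population (3 binary neurons, so the joint distribution can be enumerated exactly), fits both model classes to the true moments, and compares the KL divergence of each fit from the true joint density. All parameters (population size, learning rate, iteration count) are illustrative assumptions.

```python
# Illustrative sketch (not the paper's model): how well does an independent
# model vs. a pairwise maximum-entropy (Ising) model approximate the joint
# distribution of binary spike patterns? Small n keeps enumeration exact.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 3  # assumed toy population size; 2**n states enumerated exactly
states = np.array(list(itertools.product([0, 1], repeat=n)), dtype=float)

# Synthetic "true" distribution with pairwise interactions (assumed data).
h_true = rng.normal(0.0, 1.0, n)
J_true = np.triu(rng.normal(0.0, 1.0, (n, n)), 1)
E = states @ h_true + np.einsum("si,ij,sj->s", states, J_true, states)
p_true = np.exp(E)
p_true /= p_true.sum()

# Independent model: product of the matched marginal firing probabilities.
m = p_true @ states  # marginals E[x_i]
p_ind = np.prod(states * m + (1.0 - states) * (1.0 - m), axis=1)

# Pairwise MaxEnt fit by gradient ascent on the exact log-likelihood:
# the gradient is the gap between data moments and model moments.
h = np.zeros(n)
J = np.zeros((n, n))
for _ in range(2000):
    E = states @ h + np.einsum("si,ij,sj->s", states, J, states)
    p = np.exp(E)
    p /= p.sum()
    h += 0.1 * (p_true @ states - p @ states)            # match means
    cov_gap = (states.T * p_true) @ states - (states.T * p) @ states
    J += 0.1 * np.triu(cov_gap, 1)                       # match pairwise stats

kl = lambda q: float(np.sum(p_true * np.log(p_true / q)))
print(f"KL(true || independent) = {kl(p_ind):.4f}")
print(f"KL(true || pairwise)    = {kl(p):.4f}")
```

Because the synthetic truth here lies inside the pairwise family, the pairwise fit drives its KL divergence toward zero while the independent model retains a residual gap; the abstract's claim concerns the analogous gap measured on real neural populations from limited data.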