The stochastic mechanism of synchronous firing in a population of neurons is studied from the viewpoint of information geometry. Higher-order interactions among neurons, which cannot be reduced to pairwise correlations, are proved to exist in synchronous firing. In a neuron pool where each neuron fires stochastically, we study the probability distribution q(r) of the activity r, the fraction of neurons in the pool that fire. When q(r) is widespread, in particular when it has two peaks, the neurons fire synchronously at some times and are quiescent at others. The mechanism that generates such a distribution is of interest because, when each neuron fires independently, the law of large numbers concentrates the activity r around its mean value. Even when pairwise or third-order interactions exist, this concentration is not resolved. Hence higher-order interactions are necessary to generate widespread activity distributions. We analyze a simple model in which neurons receive common overlapping inputs and prove that such a model can have a widespread activity distribution, generating higher-order stochastic interactions.
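The contrast described in the abstract can be illustrated numerically. The following is a minimal sketch, not the paper's exact model: it compares independent Bernoulli firing, where the activity r concentrates around its mean by the law of large numbers, against a simple threshold model in which all neurons share one common Gaussian input (a crude stand-in for the common overlapping inputs discussed above). All parameter values here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100    # neurons in the pool (illustrative)
T = 20000  # trials

# Case 1: independent firing with probability p per neuron.
# The activity r concentrates near p with variance p(1-p)/N.
p = 0.5
spikes_indep = rng.random((T, N)) < p
r_indep = spikes_indep.mean(axis=1)

# Case 2: a shared Gaussian input s per trial plus private noise;
# a neuron fires when its total input crosses zero. The common input
# induces higher-order correlations and a widespread distribution of r.
s = rng.normal(size=(T, 1))            # common input, same for all neurons
noise = 0.5 * rng.normal(size=(T, N))  # private noise, per neuron
spikes_common = (s + noise) > 0.0
r_common = spikes_common.mean(axis=1)

print("var(r), independent :", np.var(r_indep))
print("var(r), common input:", np.var(r_common))
```

Under this sketch the common-input variance of r is orders of magnitude larger than the independent-firing variance, and a histogram of r_common piles up near 0 and 1 (synchronous firing versus quiescence), in line with the two-peaked q(r) described above.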