Tremendous effort has been devoted to understanding the nature of neuronal coding, its high efficiency, and the mechanisms governing it. Paprocki and Szczepanski [13] explored the neuron model proposed by Levy and Baxter [12] and analyzed its efficiency with respect to synaptic failure, activation threshold, firing rate, and the type of input source. In this paper we study the influence of amplitude fluctuations (damping, uniform, and amplifying), another important component of neuronal computations. Efficiency is understood in the sense of the mutual information between input and output signals, a fundamental concept of Shannon's communication theory. Using high-quality entropy estimators, we determined the maximal values of the mutual information between input and output neuronal signals for a simple neuronal architecture. We observed that this maximal efficiency remains nearly constant, almost regardless of the fluctuation type. We also found that, for a wide range of thresholds, both for damping and amplifying fluctuations, the mutual information behaves in a way opposite to the corresponding correlations between input and output signals. These calculations confirm that neuronal coding is much more subtle than a straightforward, intuitive optimization of input-output correlations.