How do the amplitude fluctuations affect the neuronal transmission efficiency

  • Authors:
  • Bartosz Paprocki; Janusz Szczepanski

  • Affiliations:
  • Institute of Mechanics and Applied Computer Science, Kazimierz Wielki University, Bydgoszcz, Poland; Institute of Fundamental Technological Research, Polish Academy of Sciences, Pawinskiego 5B, 02-106 Warsaw, Poland and Institute of Mechanics and Applied Computer Science, Kazimierz Wielki University, Bydgoszcz, Poland

  • Venue:
  • Neurocomputing
  • Year:
  • 2013


Abstract

Tremendous effort has been devoted to understanding the nature of neuronal coding, its high efficiency and the mechanisms governing it. Paprocki and Szczepanski [13] explored the neuron model proposed by Levy and Baxter [12] and analyzed its efficiency with respect to synaptic failure, activation threshold, firing rate and the type of input source. In this paper we study the influence of amplitude fluctuations (damping, uniform and amplifying), another important component of neuronal computation. Efficiency is understood in the sense of mutual information (between input and output signals), which is the fundamental concept of Shannon's communication theory. Using high-quality entropy estimators, we determined the maximal values of the mutual information between input and output neuronal signals for a simple neuronal architecture. We observed that this maximal efficiency remains nearly constant, almost regardless of the fluctuation type. We also found that, for a wide range of thresholds, both for damping and amplifying fluctuations, the mutual information behaves in the opposite way to the corresponding correlation between input and output signals. These results confirm that neuronal coding is much more subtle than a straightforward optimization of input-output correlations.
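
To illustrate the efficiency measure referred to above, the sketch below estimates the mutual information I(X;Y) = H(X) + H(Y) - H(X,Y) between binary input and output spike trains passed through a toy Levy-Baxter-style channel with synaptic failure, a uniform amplitude fluctuation and an activation threshold. The parameter values and the naive plug-in estimator are illustrative assumptions only; the paper itself relies on higher-quality entropy estimators and a more detailed model.

```python
import numpy as np

def plugin_entropy(symbols):
    """Naive plug-in (maximum-likelihood) entropy estimate, in bits."""
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def mutual_information(x, y):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for binary sequences.

    The joint symbol of a binary pair is encoded as 2*x + y.
    """
    return plugin_entropy(x) + plugin_entropy(y) - plugin_entropy(2 * x + y)

# Toy channel (illustrative parameters, not the paper's): each input spike
# survives the synapse with probability s, is scaled by a random amplitude
# drawn uniformly from [0, 1] ("uniform" fluctuation), and the neuron fires
# if the resulting input exceeds the activation threshold.
rng = np.random.default_rng(0)
n, s, threshold = 100_000, 0.8, 0.4
x = rng.binomial(1, 0.5, size=n)                  # Bernoulli input spike train
passed = x * rng.binomial(1, s, size=n)           # synaptic failure
amplitude = rng.uniform(0.0, 1.0, size=n)         # amplitude fluctuation
y = (passed * amplitude > threshold).astype(int)  # thresholded output

print(f"I(X;Y) ~ {mutual_information(x, y):.3f} bits per symbol")
```

Sweeping the threshold (or replacing the uniform amplitude distribution with a damping or amplifying one) in such a sketch is a rough analogue of the analysis described in the abstract, where the mutual information and the input-output correlation can move in opposite directions.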