Neural Computation
If a neuron receives many statistically independent excitatory inputs, its net excitation is binomially distributed, and with so many inputs this distribution is well approximated by a Gaussian; in that regime, quantal synaptic failures are essentially harmless. The presumption of statistical independence, however, is too simplistic. To capture statistical dependence among the inputs, we consider mixture distributions, a class that can be far from Gaussian even when the individual component distributions are themselves Gaussian. Here we show when quantal synaptic failures can move the kurtosis of a mixture distribution toward the Gaussian value.
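As a rough numerical illustration of the kind of effect described (not the article's model), consider net excitation drawn from a two-component binomial mixture, where the shared component index induces dependence among inputs. Quantal failure with transmission probability s thins each input, turning Binomial(n, p) into Binomial(n, sp). The exact kurtosis can then be computed from the component moments; all parameter values below (n, the two firing probabilities, s) are hypothetical choices for illustration.

```python
# Illustrative sketch (assumed two-component binomial mixture, not the
# article's model): exact kurtosis with and without quantal failures.

def binom_central(n, p):
    """Mean and 2nd-4th central moments of Binomial(n, p)."""
    q = 1.0 - p
    var = n * p * q
    mu3 = var * (1.0 - 2.0 * p)
    mu4 = var * (3.0 * var + 1.0 - 6.0 * p * q)
    return n * p, var, mu3, mu4

def mixture_kurtosis(n, ps, weights):
    """Kurtosis (mu4 / mu2**2) of a mixture of Binomial(n, p) components."""
    # Convert each component's central moments to raw moments, mix them,
    # then convert the mixture's raw moments back to central moments.
    M1 = M2 = M3 = M4 = 0.0
    for p, w in zip(ps, weights):
        m1, var, mu3, mu4 = binom_central(n, p)
        M1 += w * m1
        M2 += w * (var + m1**2)
        M3 += w * (mu3 + 3*m1*var + m1**3)
        M4 += w * (mu4 + 4*m1*mu3 + 6*m1**2*var + m1**4)
    mu2 = M2 - M1**2
    mu4 = M4 - 4*M1*M3 + 6*M1**2*M2 - 3*M1**4
    return mu4 / mu2**2

n, ps, w = 100, (0.2, 0.8), (0.5, 0.5)
k_intact = mixture_kurtosis(n, ps, w)   # strongly bimodal: kurtosis well below 3
s = 0.05                                # each quantum transmits with probability s
k_failed = mixture_kurtosis(n, [s * p for p in ps], w)
print(k_intact, k_failed)
```

With these (hypothetical) parameters the intact mixture is strongly bimodal and platykurtic, while heavy quantal failure pulls its kurtosis much closer to the Gaussian value of 3, consistent with the direction of the effect the abstract describes.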