We set forth an information-theoretical measure to quantify neurotransmission reliability while taking full account of the metrical properties of the spike train space. This parametric information analysis relies on similarity measures induced by the metrical relations between neural responses as spikes flow in. Thus, to assess the entropy, the conditional entropy, and the overall information transfer, the method does not require any a priori decoding algorithm to partition the space into equivalence classes. It therefore allows the optimal parameters of a class of distances to be determined with respect to information transmission. To validate the proposed information-theoretical approach, we study precise temporal decoding of human somatosensory signals recorded in microneurography experiments. For this analysis, we employ a similarity measure based on the Victor-Purpura spike train metric. We show that, with appropriate parameters of this distance, the relative spike times of the mechanoreceptors' responses convey enough information to perform optimal discrimination (defined as maximum metrical information and zero conditional entropy) of 81 distinct stimuli within 40 ms of the first afferent spike. The proposed information-theoretical measure proves to be a suitable generalization of Shannon mutual information that takes the metrics of temporal codes into account explicitly. It allows neurotransmission reliability to be assessed in large spike train spaces (e.g., neural population codes) with high temporal precision.
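The similarity measure used in the abstract builds on the standard Victor-Purpura spike train metric. As background, the sketch below implements that standard metric via its usual dynamic-programming recursion; it is not the paper's metrical-information estimator itself, only the underlying distance. The cost parameter `q` (in 1/s) sets temporal precision: shifting a spike by `dt` costs `q*|dt|`, while inserting or deleting a spike costs 1, so as `q -> 0` the metric reduces to a spike-count difference and large `q` approaches a coincidence detector.

```python
def victor_purpura(a, b, q):
    """Victor-Purpura distance between spike trains a and b (times in s).

    Minimum total cost of transforming a into b, where shifting a spike
    by dt costs q*|dt| and inserting/deleting a spike costs 1.
    """
    n, m = len(a), len(b)
    # G[i][j]: distance between the first i spikes of a and first j of b
    G = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        G[i][0] = float(i)          # delete all i spikes of a
    for j in range(1, m + 1):
        G[0][j] = float(j)          # insert all j spikes of b
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            G[i][j] = min(
                G[i - 1][j] + 1.0,                               # delete a[i-1]
                G[i][j - 1] + 1.0,                               # insert b[j-1]
                G[i - 1][j - 1] + q * abs(a[i - 1] - b[j - 1]),  # shift a spike
            )
    return G[n][m]
```

Sweeping `q` and locating the value that maximizes the resulting metrical information is, per the abstract, how the optimal distance parameters are determined with respect to information transmission.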