For any memoryless communication channel with a binary-valued input and a one-dimensional real-valued output, we introduce a probabilistic lower bound on the mutual information given empirical observations of the channel. The bound is built on the Dvoretzky-Kiefer-Wolfowitz inequality and is distribution-free. We describe a quadratic-time algorithm for computing the bound and its corresponding class-conditional distribution functions. We compare our approach with existing techniques and show that our bound is superior to a method, inspired by Fano's inequality, in which the continuous output variable is discretized.
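For reference, the Dvoretzky-Kiefer-Wolfowitz inequality states that for n i.i.d. samples the empirical CDF F_n satisfies P(sup_x |F_n(x) - F(x)| > eps) <= 2 exp(-2 n eps^2), so a band of half-width eps(n, alpha) = sqrt(ln(2/alpha) / (2n)) around F_n contains the true CDF with probability at least 1 - alpha. Below is a minimal sketch of the class-conditional DKW bands on which such a bound can be built; the function name dkw_band, the confidence level alpha, and the Gaussian test data are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def dkw_band(samples, alpha=0.05):
    """Empirical CDF of `samples` plus a DKW confidence band.

    By the DKW inequality, with probability at least 1 - alpha the true
    CDF lies within eps = sqrt(ln(2/alpha) / (2n)) of the empirical CDF,
    uniformly over all x.
    """
    x = np.sort(samples)
    n = len(x)
    ecdf = np.arange(1, n + 1) / n
    eps = np.sqrt(np.log(2.0 / alpha) / (2.0 * n))
    lower = np.clip(ecdf - eps, 0.0, 1.0)  # clip so the band stays in [0, 1]
    upper = np.clip(ecdf + eps, 0.0, 1.0)
    return x, ecdf, lower, upper

# One band per input symbol of the binary-input channel (the Gaussian
# draws are only a stand-in for real channel observations).
rng = np.random.default_rng(0)
y0 = rng.normal(0.0, 1.0, 200)  # outputs observed when the input was 0
y1 = rng.normal(1.0, 1.0, 200)  # outputs observed when the input was 1
x0, F0, lo0, hi0 = dkw_band(y0)
x1, F1, lo1, hi1 = dkw_band(y1)
```

Presumably the quadratic-time algorithm mentioned in the abstract selects class-conditional distribution functions inside bands of this kind so as to minimize the mutual information; the sketch above only constructs the bands themselves.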