Performance in sensory discrimination tasks is commonly quantified using either information theory or ideal observer analysis. These two quantitative frameworks are often assumed to be equivalent; for example, higher mutual information is taken to imply better performance by an ideal observer in a stimulus estimation task. Contrary to this assumption, drawing on and extending previous results, we show that five information-theoretic quantities (entropy, response-conditional entropy, specific information, equivocation, and mutual information) can each disagree with ideal observer performance. More positively, we show how these information measures can be used to calculate upper and lower bounds on ideal observer performance, and vice versa. The results indicate that the mathematical tools of ideal observer analysis are preferable to information theory for evaluating performance in a stimulus discrimination task. We also discuss the applicability of information theory to questions that ideal observer analysis cannot address.
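As an illustration of the kind of bound the abstract refers to (a sketch, not the paper's own derivation), Fano's inequality links one of the listed information measures, the equivocation H(S|R), to the Bayes error probability Pe of an ideal (MAP) observer: H(S|R) <= H_b(Pe) + Pe*log2(|S|-1). The joint distribution below is hypothetical, chosen only to make the computation concrete.

```python
import numpy as np

# Hypothetical joint distribution P(s, r): 3 stimuli (rows) x 3 responses (cols).
P = np.array([[0.20, 0.05, 0.05],
              [0.05, 0.20, 0.05],
              [0.05, 0.05, 0.30]])

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_S = entropy(P.sum(axis=1))  # stimulus entropy H(S)

# Equivocation H(S|R): entropy of the posterior over stimuli,
# averaged over responses.
H_S_given_R = sum(P[:, r].sum() * entropy(P[:, r] / P[:, r].sum())
                  for r in range(P.shape[1]))

MI = H_S - H_S_given_R  # mutual information I(S;R)

# Bayes error of the ideal (MAP) observer: for each response,
# guess the stimulus with the largest posterior probability.
Pe = 1.0 - np.sum(P.max(axis=0))

# Fano's inequality: equivocation lower-bounds the achievable error.
fano_rhs = entropy(np.array([Pe, 1.0 - Pe])) + Pe * np.log2(P.shape[0] - 1)
assert H_S_given_R <= fano_rhs + 1e-12
```

For this distribution the MAP error is Pe = 0.30, and the equivocation (about 1.18 bits) sits just below the Fano bound, showing that the information measure constrains, but does not uniquely determine, ideal observer performance.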