Multiple objects: error exponents in hypotheses testing and identification
Information Theory, Combinatorics, and Search Theory
In two-hypothesis detection problems with i.i.d. observations, the minimum error probability decays exponentially in the number of observations, with the constant in the exponent equal to the Chernoff distance between the probability distributions characterizing the two hypotheses. We extend this result to the general M-hypothesis Bayesian detection problem in which zero cost is assigned to correct decisions, and show that the exponential decay constant of the Bayesian cost function equals the minimum Chernoff distance over all distinct pairs of hypothesized probability distributions.
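The quantity at the heart of the abstract can be illustrated numerically. The sketch below (an illustration, not the authors' method) computes the Chernoff distance C(p, q) = -min over 0 < s < 1 of log Σ_x p(x)^s q(x)^(1-s) for discrete distributions by a simple grid search over s, and then takes the minimum over all distinct pairs of hypothesized distributions, which is the exponential decay constant described for the M-hypothesis Bayesian problem. Function names and the grid-search resolution are illustrative choices.

```python
import itertools
import math

def chernoff_distance(p, q, grid=1000):
    # Chernoff distance between two discrete distributions p and q,
    # given as equal-length probability vectors:
    #   C(p, q) = -min_{0 < s < 1} log sum_x p(x)^s * q(x)^(1-s)
    # Approximated here by a grid search over s (illustrative, not optimal).
    best = float("inf")
    for k in range(1, grid):
        s = k / grid
        val = math.log(sum((pi ** s) * (qi ** (1 - s))
                           for pi, qi in zip(p, q)
                           if pi > 0 and qi > 0))
        best = min(best, val)
    return -best

def bayesian_error_exponent(dists):
    # Minimum Chernoff distance over all distinct pairs of hypothesized
    # distributions -- the exponential decay constant of the Bayesian
    # cost function described in the abstract.
    return min(chernoff_distance(p, q)
               for p, q in itertools.combinations(dists, 2))
```

For example, with three hypothesized coin biases Bern(0.5), Bern(0.6), and Bern(0.9), the exponent is governed by the closest pair, Bern(0.5) versus Bern(0.6): adding a hypothesis far from all others does not change the decay rate, while adding one close to an existing hypothesis can sharply reduce it.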