Reliability criteria in information theory and in statistical hypothesis testing
Foundations and Trends in Communications and Information Theory
On arbitrarily varying Markov source coding and hypothesis LAO testing by non-informed statistician
Proceedings of the 2009 IEEE International Symposium on Information Theory (ISIT 2009), Volume 2
On logarithmically asymptotically optimal testing of hypotheses and identification
General Theory of Information Transfer and Combinatorics
Multiple objects: error exponents in hypotheses testing and identification
Information Theory, Combinatorics, and Search Theory
Hypothesis testing for an arbitrarily varying source (AVS) is considered. We determine the best asymptotic exponent of the error probability of the second kind when the error probability of the first kind is at most 2^{-nr}. This result generalizes the well-known theorem of Hoeffding (1965), Csiszár and Longo (1971), and Blahut (1974) for hypothesis testing under an exponential-type constraint. As a corollary in information theory, the best asymptotic error exponent and the r-optimal rate (the minimum compression rate when the error probability is at most 2^{-nr}, r ⩾ 0) of AVS coding are determined.
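For orientation, the classical i.i.d. result that the abstract says is being generalized (the Hoeffding–Csiszár–Longo–Blahut exponent) can be sketched as follows. The notation P_1, P_2, Q below is an assumption for illustration and is not taken from the abstract itself.

```latex
% Sketch of the classical (i.i.d.) exponent under an exponential
% type-I constraint \alpha_n \le 2^{-nr}. Here P_1 and P_2 are the
% two hypothesized source distributions on a finite alphabet, and Q
% ranges over distributions on that alphabet (assumed notation).
\[
  E_2(r) \;=\; \min_{Q \,:\, D(Q \,\|\, P_1) \le r} D(Q \,\|\, P_2),
\]
% where D(\cdot \| \cdot) denotes the Kullback--Leibler divergence.
% For r = 0 this recovers Stein's lemma exponent D(P_1 \| P_2).
```

The AVS result described in the abstract extends this trade-off to sources whose distribution may vary arbitrarily within a given family.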