Two problems concerning the arbitrarily varying stationary Markov source (AVMS), namely the binary hypothesis testing problem and the source coding problem, are solved. First, we consider logarithmically asymptotically optimal (LAO) hypothesis testing (HT) for distributions of an AVMS. The asymptotic behavior of the exponent of the second-type error probability is investigated as a function of the exponent of the first-type error probability, as the number of observations tends to infinity. In the AVMS coding problem, the E-optimal rate function R(E) (the minimum rate R of source-sequence compression at which the decoding error probability is less than exp{-NE}, E > 0) and its inverse, the reliability function E(R), are obtained from the corresponding HT result.
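To illustrate the kind of error-exponent tradeoff the LAO approach characterizes, the following minimal sketch computes, for the much simpler i.i.d. binary case (not the AVMS setting of the paper), the best second-type exponent attainable given a first-type exponent E1: by the method of types this is min over distributions Q with D(Q||P1) <= E1 of D(Q||P2). All function names here are illustrative, not from the paper.

```python
import math

def kl(p, q):
    # KL divergence D(P||Q) between Bernoulli(p) and Bernoulli(q),
    # in nats; 0*log(0/b) is taken as 0 by convention.
    def term(a, b):
        return 0.0 if a == 0 else a * math.log(a / b)
    return term(p, q) + term(1 - p, 1 - q)

def lao_tradeoff(p1, p2, e1, grid=100001):
    # E2(E1) = min over Bernoulli(q) with D(Q||P1) <= E1 of D(Q||P2),
    # approximated by a grid search over q in [0, 1].
    best = float("inf")
    for i in range(grid):
        q = i / (grid - 1)
        if kl(q, p1) <= e1:
            best = min(best, kl(q, p2))
    return best
```

At E1 = 0 this recovers Stein's exponent D(P1||P2); as E1 grows, E2(E1) decreases and hits 0 once D(P2||P1) <= E1, mirroring the tradeoff between the two error exponents studied in the abstract.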