General theory of information transfer: Updated. In: Discrete Applied Mathematics.
Reliability criteria in information theory and in statistical hypothesis testing. In: Foundations and Trends in Communications and Information Theory.
Estimating with randomized encoding the joint empirical distribution in a correlated source. In: General Theory of Information Transfer and Combinatorics.
On logarithmically asymptotically optimal testing of hypotheses and identification. In: General Theory of Information Transfer and Combinatorics.
Bibliography of publications by Rudolf Ahlswede. In: Information Theory, Combinatorics, and Search Theory.
A new coding problem is introduced for a correlated source (X^n, Y^n)_{n=1}^∞. The observer of X^n can transmit data depending on X^n at a prescribed rate R. Based on these data, the observer of Y^n tries to identify whether n^{-1} ρ(X^n, Y^n) ≤ d for some distortion measure ρ (such as the Hamming distance), a prescribed fidelity criterion. We investigate, as functions of R and d, the exponents of two error probabilities: the probability of misacceptance and the probability of misrejection. In the case where X^n and Y^n are independent, we completely characterize the achievable region for the rate R and the two error exponents; in the case where X^n and Y^n are correlated, we obtain some interesting partial results on the achievable region. In the process, we develop a new method for proving converses, called "the inherently typical subset lemma". This new method goes considerably beyond the "entropy characterization," the "image size characterization," and their extensions. It is conceivable that this new method will have a strong impact on multiuser information theory.
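The fidelity criterion at the heart of the problem can be made concrete with a toy sketch. The following is only an illustration of the acceptance test n^{-1} ρ(X^n, Y^n) ≤ d under Hamming distortion, assuming the Y^n-observer had full access to X^n; the actual coding problem constrains the X^n-observer to rate-R messages, which this sketch does not model. The function names are hypothetical.

```python
import random

def normalized_hamming(x, y):
    """Normalized Hamming distortion n^{-1} * rho(x, y)."""
    assert len(x) == len(y)
    return sum(a != b for a, b in zip(x, y)) / len(x)

def accept(x, y, d):
    """Ideal (unconstrained) identification test: accept iff the
    fidelity criterion n^{-1} rho(x, y) <= d is met."""
    return normalized_hamming(x, y) <= d

# Toy correlated source: y is x passed through a BSC with crossover 0.1.
random.seed(0)
n = 1000
x = [random.randint(0, 1) for _ in range(n)]
y = [1 - b if random.random() < 0.1 else b for b in x]
print(accept(x, y, 0.15))
```

With rate-limited data about X^n, the Y^n-observer can only approximate this test, which is what gives rise to the two error exponents (misacceptance and misrejection) studied in the paper.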