We study the randomness necessary for the simulation of a random process with given distributions, in terms of the finite-precision resolvability of the process. Finite-precision resolvability is defined as the minimal random-bit rate required by the simulator as a function of the accuracy with which the distributions are replicated. The accuracy is quantified by means of various measures: variational distance, divergence, Ornstein (1973), Prohorov (1956), and related measures of distance between the distributions of random processes. In the case of the Ornstein, Prohorov, and other distances of the Kantorovich-Vasershtein type, we show that the finite-precision resolvability is equal to the rate-distortion function with a fidelity criterion derived from the accuracy measure. This connection leads to new results on nonstationary rate-distortion theory. In the case of variational distance, the resolvability of stationary ergodic processes is shown to equal the entropy rate regardless of the allowed accuracy. In the case of normalized divergence, explicit expressions for finite-precision resolvability are obtained in many cases of interest, and connections with data compression with minimum probability of block error are shown.
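The trade-off described above can be illustrated numerically: a simulator driven by k fair random bits can realize exactly those distributions whose atom probabilities are multiples of 2^-k, so the best achievable variational distance to a target distribution shrinks as the bit budget grows. The following sketch (an illustration, not the paper's construction; the quantization uses a standard largest-remainder rounding) measures that accuracy for a single-letter example:

```python
def best_dyadic_approx(p, k):
    """Approximate distribution p by one whose probabilities are
    multiples of 2**-k, i.e. one realizable with k fair random bits.
    Largest-remainder rounding minimizes the total-variation error."""
    m = 2 ** k
    # Round each probability down to a multiple of 1/m ...
    floors = [int(pi * m) for pi in p]
    # ... then hand the leftover mass to the largest fractional parts.
    rem = m - sum(floors)
    order = sorted(range(len(p)), key=lambda i: p[i] * m - floors[i],
                   reverse=True)
    q = floors[:]
    for i in order[:rem]:
        q[i] += 1
    return [qi / m for qi in q]

def tv_distance(p, q):
    """Variational (total-variation) distance between distributions."""
    return 0.5 * sum(abs(a - b) for a, b in zip(p, q))

# Hypothetical target distribution; accuracy improves with the bit budget k.
p = [0.5, 0.3, 0.2]
for k in (1, 2, 4, 8):
    q = best_dyadic_approx(p, k)
    print(k, tv_distance(p, q))
```

Running this shows the variational distance dropping monotonically with k, the per-sample analogue of resolvability as a rate-versus-accuracy function.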