On universal simulation of information sources using training data
IEEE Transactions on Information Theory
We consider the problem of universal simulation of a memoryless source (with some partial extensions to Markov sources) based on a training sequence emitted by the source. The objective is to maximize the conditional entropy of the simulated sequence given the training sequence, subject to a distance constraint between the probability distributions of the output sequence and of the input (training) sequence. For several distance criteria, we derive single-letter expressions for the maximum attainable conditional entropy, together with universal simulation schemes that asymptotically attain these maxima.
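To make the setting concrete, here is a minimal sketch of one simple universal simulation scheme in the spirit of the abstract: output a uniformly random permutation of the training sequence. This keeps the empirical distribution of the output identical to that of the training sequence (a zero-distance constraint on types), while drawing uniformly among all sequences of the same type, so the conditional entropy of the output given the training sequence equals the log of the type-class size. The function name and the permutation-based construction are illustrative assumptions, not the paper's exact schemes.

```python
import math
import random
from collections import Counter

def simulate_from_training(training, rng=random):
    """Illustrative universal simulator (assumption, not the paper's scheme):
    return a uniformly random permutation of the training sequence.

    Because only the order changes, the output has exactly the same
    empirical (type) distribution as the training sequence; among all
    maps with that property, uniform sampling within the type class
    maximizes the conditional entropy of the output given the training.
    """
    out = list(training)
    rng.shuffle(out)  # uniform over all permutations of the multiset
    return out

def type_class_log_size(training):
    """log |T(x)|: conditional entropy (in nats) achieved by uniform
    sampling within the type class of the training sequence."""
    n = len(training)
    counts = Counter(training)
    log_size = math.lgamma(n + 1)
    for c in counts.values():
        log_size -= math.lgamma(c + 1)
    return log_size

training = list("aababbaabbab")
simulated = simulate_from_training(training)

# The simulated sequence preserves the empirical distribution exactly.
assert Counter(simulated) == Counter(training)
```

Schemes meeting a nonzero distance constraint can trade exactness of the output distribution for additional randomness; the single-letter results in the paper quantify the best such trade-off.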