In this paper, the redundancy of both variable-rate and fixed-rate Slepian-Wolf coding is considered. Given any jointly memoryless source-side information pair {(X_i, Y_i)}_{i=1}^∞ with finite alphabets, the redundancy R_n(ε_n) of variable-rate Slepian-Wolf coding of X_1^n with decoder-only side information Y_1^n depends on both the block length n and the decoding block error probability ε_n; it is defined as the difference between the minimum average compression rate of order-n variable-rate Slepian-Wolf codes whose decoding block error probability is at most ε_n and the conditional entropy H(X|Y), where H(X|Y) is the conditional entropy rate of the source given the side information. The redundancy of fixed-rate Slepian-Wolf coding of X_1^n with decoder-only side information Y_1^n is defined similarly and denoted by R_n^F(ε_n). It is proved that, under mild assumptions on ε_n,

R_n(ε_n) = d_v √(−log ε_n / n) + o(√(−log ε_n / n)) and
R_n^F(ε_n) = d_f √(−log ε_n / n) + o(√(−log ε_n / n)),

where d_v and d_f are two constants completely determined by the joint distribution of the source-side information pair. Since d_v is generally smaller than d_f, our results show that variable-rate Slepian-Wolf coding is indeed more efficient than fixed-rate Slepian-Wolf coding.
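The quantities in the abstract can be illustrated numerically. The sketch below computes the conditional entropy H(X|Y) for a doubly symmetric binary source (a standard textbook example, not taken from this paper) and evaluates the leading redundancy term d·√(−log ε_n / n) for hypothetical constants d_v and d_f; the actual constants are determined by the joint distribution in a way the abstract does not spell out, and the log base follows the paper's rate units.

```python
import math

def conditional_entropy(joint):
    """H(X|Y) in bits, for a joint pmf given as a dict {(x, y): p}."""
    p_y = {}
    for (x, y), p in joint.items():
        p_y[y] = p_y.get(y, 0.0) + p
    h = 0.0
    for (x, y), p in joint.items():
        if p > 0:
            h -= p * math.log2(p / p_y[y])  # -sum p(x,y) log2 p(x|y)
    return h

# Hypothetical source-side information pair: X ~ Bernoulli(1/2),
# Y is X passed through a binary symmetric channel with crossover q.
q = 0.1
joint = {(0, 0): 0.5 * (1 - q), (0, 1): 0.5 * q,
         (1, 0): 0.5 * q,       (1, 1): 0.5 * (1 - q)}
h_xy = conditional_entropy(joint)  # equals the binary entropy h(q) ≈ 0.469 bits

def redundancy_leading_term(d, n, eps):
    """Leading term d * sqrt(-log(eps) / n) of the redundancy expansion."""
    return d * math.sqrt(-math.log(eps) / n)

# Hypothetical constants with d_v < d_f, as the abstract indicates is typical:
d_v, d_f = 0.8, 1.2
n, eps = 10_000, 1e-3
r_var = redundancy_leading_term(d_v, n, eps)
r_fix = redundancy_leading_term(d_f, n, eps)
```

Since both redundancies share the same √(−log ε_n / n) decay, their ratio tends to d_v/d_f; the variable-rate scheme's advantage is a constant factor in the leading term, not a faster order of decay.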