This paper proposes a new algorithm based on the Context-Tree Weighting (CTW) method for universal compression of a finite-alphabet sequence x_1^n with side information y_1^n available to both the encoder and decoder. We prove that, with probability one, the compression ratio converges to the conditional entropy rate for jointly stationary ergodic sources. Experimental results with Markov chains and English texts show the effectiveness of the algorithm.
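The core idea can be illustrated in miniature. The sketch below is not the paper's full CTW algorithm: instead of mixing over all context-tree depths, it uses a single fixed-order conditional model with a Krichevsky-Trofimov (KT) estimator per context, where each context combines past symbols of x_1^n with the co-located side-information symbols of y_1^n. The function name, the order parameter, and the binary-alphabet restriction are illustrative assumptions; when the side information is informative, the resulting per-symbol code length drops toward the conditional entropy rate.

```python
import math
from collections import defaultdict

def conditional_code_length(x, y, order=1):
    """Sequential code length (in bits) for a binary sequence x given
    side information y, using a Krichevsky-Trofimov estimator in each
    context. The context at time t is (x[t-order:t], y[t-order:t+1]),
    i.e. recent symbols of x plus the aligned side information,
    including the current y[t].

    NOTE: a fixed-order simplification for illustration, not the
    full context-tree weighting mixture over variable-depth models.
    """
    counts = defaultdict(lambda: [0, 0])  # context -> [#zeros, #ones]
    bits = 0.0
    for t in range(len(x)):
        ctx = (tuple(x[max(0, t - order):t]),
               tuple(y[max(0, t - order):t + 1]))
        n0, n1 = counts[ctx]
        # KT sequential probability of the symbol actually observed
        p = (counts[ctx][x[t]] + 0.5) / (n0 + n1 + 1)
        bits -= math.log2(p)
        counts[ctx][x[t]] += 1
    return bits
```

For example, feeding y = x (perfect side information) drives the per-symbol code length close to 0, while a constant, uninformative y leaves it near the 1 bit/symbol entropy of a fair coin, mirroring the convergence to the conditional entropy rate claimed above.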