Lattice Quantization with Side Information
DCC '00 Proceedings of the Conference on Data Compression
Design of Optimal Quantizers for Distributed Source Coding
DCC '03 Proceedings of the Conference on Data Compression
Elements of Information Theory (Wiley Series in Telecommunications and Signal Processing)
An Implementable Scheme for Universal Lossy Compression of Discrete Markov Sources
DCC '09 Proceedings of the 2009 Data Compression Conference
A context quantization approach to universal denoising
IEEE Transactions on Signal Processing
Discrete denoising with shifts
IEEE Transactions on Information Theory
Algorithms for discrete denoising under channel uncertainty
IEEE Transactions on Signal Processing
Simple universal lossy data compression schemes derived from the Lempel-Ziv algorithm
IEEE Transactions on Information Theory
Systematic lossy source/channel coding
IEEE Transactions on Information Theory
Nested linear/lattice codes for structured multiterminal binning
IEEE Transactions on Information Theory
Distributed source coding using syndromes (DISCUS): design and construction
IEEE Transactions on Information Theory
Universal discrete denoising: known channel
IEEE Transactions on Information Theory
On the Wyner-Ziv problem for individual sequences
IEEE Transactions on Information Theory
Universal Minimax Discrete Denoising Under Channel Uncertainty
IEEE Transactions on Information Theory
Schemes for Bidirectional Modeling of Discrete Stationary Sources
IEEE Transactions on Information Theory
Universal Denoising of Discrete-Time Continuous-Amplitude Signals
IEEE Transactions on Information Theory
The source-channel separation theorem revisited
IEEE Transactions on Information Theory
We consider the Wyner-Ziv (WZ) problem of lossy compression in which the decompressor observes a noisy version of the source whose statistics are unknown. A new family of WZ coding algorithms is proposed, and its universal optimality is proven. Compression consists of sliding-window processing followed by Lempel-Ziv (LZ) compression, while the decompressor is based on a modification of the discrete universal denoiser (DUDE) algorithm that exploits the side information. The new algorithms not only universally attain the fundamental limits, but also suggest a paradigm for practical WZ coding. The effectiveness of our approach is illustrated with experiments on binary images and English text, using a low-complexity algorithm motivated by our class of universally optimal WZ codes.
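To make the decoder side of the pipeline concrete, the following is a minimal sketch of the basic two-pass DUDE on which the decompressor is modeled, here without the side-information modification described in the paper. It assumes a binary source observed through a binary symmetric channel with known crossover probability `delta`; the function name `dude_bsc` and the context radius `k` are illustrative choices, not the paper's notation.

```python
import numpy as np
from collections import defaultdict

def dude_bsc(z, delta, k=2):
    """Two-pass discrete universal denoiser (DUDE) for a binary
    sequence z corrupted by a BSC(delta), under Hamming loss.

    Pass 1: for every two-sided context of radius k, count how often
    each noisy symbol occurs in the middle.
    Pass 2: invert the channel on the context counts and pick the
    reconstruction symbol with the smallest estimated expected loss.
    """
    n = len(z)
    Pi = np.array([[1 - delta, delta],
                   [delta, 1 - delta]])      # channel transition matrix
    Pi_inv = np.linalg.inv(Pi)
    Lam = 1.0 - np.eye(2)                    # Hamming loss matrix

    # Pass 1: empirical counts m(context, center symbol).
    counts = defaultdict(lambda: np.zeros(2))
    for i in range(k, n - k):
        ctx = (tuple(z[i - k:i]), tuple(z[i + 1:i + k + 1]))
        counts[ctx][z[i]] += 1

    # Pass 2: denoise each interior symbol with the DUDE rule
    #   argmin_xhat  Lam[:, xhat]^T ( (Pi^{-T} m) * Pi[:, z_i] ).
    x_hat = list(z)
    for i in range(k, n - k):
        ctx = (tuple(z[i - k:i]), tuple(z[i + 1:i + k + 1]))
        m = counts[ctx]
        p = Pi_inv.T @ m                     # channel-inverted counts
        cost = Lam @ (p * Pi[:, z[i]])       # expected loss per candidate
        x_hat[i] = int(np.argmin(cost))
    return x_hat
```

On sequences with strong local structure (long runs, as in binary images), the context counts dominate the channel noise and most flipped symbols are corrected; the paper's decompressor additionally conditions these counts on the compressed side information.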