IEEE Transactions on Information Theory
Consider a pair of correlated Gaussian sources (X1, X2). Two separate encoders observe the two components and communicate compressed versions of their observations to a common decoder. The decoder wishes to reconstruct a linear combination of X1 and X2 to within a mean-square distortion D. We obtain an inner bound to the optimal rate-distortion region for this problem. A portion of this inner bound is achieved by a scheme that reconstructs the linear function directly, rather than first reconstructing the individual components X1 and X2; this yields a better rate region for certain parameter values. Our coding scheme relies on lattice coding techniques, in contrast to the more prevalent random coding arguments used to demonstrate achievable rate regions in information theory. We then consider the linear reconstruction of K sources and provide an inner bound to the optimal rate-distortion region. Some parts of this inner bound are achieved using the following coding structure: lattice vector quantization followed by "correlated" lattice-structured binning.
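The structural idea behind "lattice quantization followed by correlated lattice-structured binning" can be illustrated with a toy one-dimensional sketch (this is not the paper's actual scheme — lattice steps `q`, `Q`, the source correlation model, and the target function X1 - X2 are all illustrative choices). Each encoder quantizes its observation with the same fine lattice q·Z and transmits only the residue modulo a shared coarse lattice Q·Z; because the binning is matched across encoders, the decoder can recover the quantized difference directly, without ever reconstructing X1 or X2 individually, as long as the function's magnitude stays below Q/2:

```python
import numpy as np

rng = np.random.default_rng(0)

q = 0.05   # fine lattice step (quantization resolution); illustrative value
Q = 4.0    # coarse lattice step (bin size); must be a multiple of q

def fine_quantize(x):
    """Map x to the nearest point of the fine lattice q*Z."""
    return q * np.round(x / q)

def coarse_bin(u):
    """Transmit only the residue of u modulo the coarse lattice Q*Z."""
    return np.mod(u, Q)

# Highly correlated Gaussian sources: the linear function
# Z = X1 - X2 has small variance compared to the sources themselves.
n = 10_000
x1 = rng.normal(0.0, 1.0, n)
x2 = x1 + rng.normal(0.0, 0.1, n)

# Both encoders use the SAME fine lattice and the SAME coarse
# lattice -- this matching is the "correlated" binning.
s1 = coarse_bin(fine_quantize(x1))
s2 = coarse_bin(fine_quantize(x2))

# Decoder: the difference of the bin residues, reduced mod Q into
# [-Q/2, Q/2), equals fine_quantize(x1) - fine_quantize(x2)
# whenever |x1 - x2| (plus quantization error) is below Q/2.
z_hat = np.mod(s1 - s2 + Q / 2, Q) - Q / 2

mse = np.mean((z_hat - (x1 - x2)) ** 2)
print(f"MSE of direct linear-function reconstruction: {mse:.2e}")
```

The reconstruction error is pure quantization noise, bounded by q per sample, even though each encoder's transmitted residue reveals its source only modulo Q — the rate saving comes from describing the bin rather than the full observation.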