Let X, Y, Z be zero-mean, jointly Gaussian random vectors of dimensions n_x, n_y, and n_z, respectively. Let P be the set of random variables W such that W ↔ Y ↔ (X, Z) forms a Markov chain. We consider the optimization problem

    min_{W ∈ P} I(Y; W | Z)

subject to one of the following two constraints: 1) I(X; W | Z) ≥ R1, or 2) the mean squared error between X and X̂ = E(X | W, Z) is less than d. The problem under the first constraint is motivated by multiple-input multiple-output (MIMO) relay channels with an oblivious transmitter and a relay connected to the receiver through a dedicated link, while under the second it is motivated by source coding with decoder side information when the sensor observation is noisy. In both cases, we show that jointly Gaussian solutions are optimal. Moreover, explicit water-filling interpretations are given for both cases; they suggest transform coding approaches performed in different transform domains, and show that the optimal solution for one problem is, in general, suboptimal for the other.
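As a minimal numerical sketch (not taken from the paper), the two conditional mutual informations above can be evaluated in closed form for a hypothetical scalar toy model: X, Z ~ N(0,1) independent, Y = X + Z + V, W = Y + N, with V and N independent zero-mean Gaussian noises. This construction makes W ↔ Y ↔ (X, Z) a Markov chain, and the Markov structure forces I(X; W | Z) ≤ I(Y; W | Z), which is why constraint 1) acts as a lower bound on the objective. All variances and the model itself are illustrative assumptions.

```python
import numpy as np

# Hypothetical noise variances for the toy model (assumptions, not from the paper).
sv2, sn2 = 0.5, 0.25

def cond_var(full_cov, target, given):
    """Var(target | given) via the Schur complement of a joint Gaussian covariance."""
    S_tt = full_cov[np.ix_(target, target)]
    S_tg = full_cov[np.ix_(target, given)]
    S_gg = full_cov[np.ix_(given, given)]
    return S_tt - S_tg @ np.linalg.solve(S_gg, S_tg.T)

# Joint covariance of (X, Z, Y, W) under the toy model:
# Var(Y) = 2 + sv2, Var(W) = Var(Y) + sn2, Cov(W, X) = Cov(Y, X) = 1, etc.
cov = np.array([
    [1.0, 0.0, 1.0,       1.0],
    [0.0, 1.0, 1.0,       1.0],
    [1.0, 1.0, 2.0 + sv2, 2.0 + sv2],
    [1.0, 1.0, 2.0 + sv2, 2.0 + sv2 + sn2],
])
X, Z, Y, W = 0, 1, 2, 3

# For jointly Gaussian scalars, I(A; W | Z) = 0.5 * log( Var(W|Z) / Var(W|A,Z) ).
I_Y_W_given_Z = 0.5 * np.log(cond_var(cov, [W], [Z]) / cond_var(cov, [W], [Y, Z])).item()
I_X_W_given_Z = 0.5 * np.log(cond_var(cov, [W], [Z]) / cond_var(cov, [W], [X, Z])).item()

# The distortion in constraint 2): MSE of the estimator X_hat = E(X | W, Z).
mse = cond_var(cov, [X], [W, Z]).item()

print(I_Y_W_given_Z, I_X_W_given_Z, mse)

# Data processing along W <-> Y <-> (X, Z) implies I(X;W|Z) <= I(Y;W|Z).
assert I_X_W_given_Z <= I_Y_W_given_Z
```

Here Var(W | Y, Z) collapses to the forward-noise variance sn2 because W depends on (X, Z) only through Y, mirroring the Markov constraint that defines the feasible set P.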