Elements of information theory. IEEE Transactions on Information Theory.
Gaussian multiterminal source coding. IEEE Transactions on Information Theory.
Statistical inference under multiterminal data compression. IEEE Transactions on Information Theory.
The worst additive noise under a covariance constraint. IEEE Transactions on Information Theory.
Achieving 1/2 log(1+SNR) on the AWGN channel with lattice encoding and decoding. IEEE Transactions on Information Theory.
The Capacity Region of the Gaussian Multiple-Input Multiple-Output Broadcast Channel. IEEE Transactions on Information Theory.
An Extremal Inequality Motivated by Multiterminal Information-Theoretic Problems. IEEE Transactions on Information Theory.
Successive Refinement for Hypothesis Testing and Lossless One-Helper Problem. IEEE Transactions on Information Theory.
We study the vector Gaussian versions of two problems: hypothesis testing under a communication constraint and the lossy one-helper problem. In the hypothesis testing problem, a test against independence is considered when one vector Gaussian source is available at the detector, which receives a rate-limited message about another vector Gaussian source. Two equivalent characterizations of the optimal type 2 error exponent are given when the type 1 error probability is at most a fixed constant. The first characterization is based on the enhancement technique introduced by Weingarten et al.; the second is transform-based. The transform-based characterization directly yields a water-pouring interpretation and establishes successive refinability. For the lossy one-helper problem, we determine a portion of the boundary of the rate region.
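The water-pouring interpretation mentioned above follows the classic pattern: a total budget is poured over parallel components until the active ones share a common water level. The sketch below is only an illustration of that general shape under assumed inputs (`noise_levels`, `budget` are hypothetical names), not the paper's actual transform-based characterization.

```python
import numpy as np

def water_filling(noise_levels, budget, tol=1e-9):
    """Illustrative water-pouring allocation: distribute 'budget' over
    components with the given noise floors so that every active component
    reaches a common water level mu, and components above mu get nothing."""
    noise = np.sort(np.asarray(noise_levels, dtype=float))
    n = len(noise)
    # Try the k lowest-noise components; the common level is
    # mu = (budget + sum of those noise floors) / k.
    for k in range(n, 0, -1):
        mu = (budget + noise[:k].sum()) / k
        if mu >= noise[k - 1] - tol:  # all k allocations stay non-negative
            return mu, np.maximum(mu - noise, 0.0)
    return None

mu, alloc = water_filling([1.0, 2.0, 4.0], budget=3.0)
# The allocation sums to the budget; the noisiest component (4.0 > mu)
# receives no power.
```

With noise floors (1, 2, 4) and a budget of 3, the level settles at mu = 3: the two quieter components receive 2 and 1, and the noisiest is left dry.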