A new entropy power inequality
IEEE Transactions on Information Theory
A note on the secrecy capacity of the multiple-antenna wiretap channel
IEEE Transactions on Information Theory
A vector generalization of Costa's entropy-power inequality with applications
IEEE Transactions on Information Theory
The rate-distortion function for the quadratic Gaussian CEO problem
IEEE Transactions on Information Theory
The worst additive noise under a covariance constraint
IEEE Transactions on Information Theory
Capacity bounds via duality with applications to multiple-antenna systems on flat-fading channels
IEEE Transactions on Information Theory
Gradient of mutual information in linear vector Gaussian channels
IEEE Transactions on Information Theory
The capacity region of the Gaussian multiple-input multiple-output broadcast channel
IEEE Transactions on Information Theory
An extremal inequality motivated by multiterminal information-theoretic problems
IEEE Transactions on Information Theory
The secrecy capacity region of the Gaussian MIMO multi-receiver wiretap channel
IEEE Transactions on Information Theory
Secrecy capacity region of the degraded compound multi-receiver wiretap channel
Proceedings of the 47th Annual Allerton Conference on Communication, Control, and Computing (Allerton '09)
This paper considers the entropy-power inequality (EPI) of Costa and presents a natural vector generalization with a real positive semidefinite matrix parameter. The new inequality is proved by a perturbation approach that exploits a fundamental relationship between the derivative of mutual information and the minimum mean-square error (MMSE) in linear vector Gaussian channels. As an application, a new extremal entropy inequality is derived from the generalized Costa EPI and then used to establish the secrecy capacity regions of the degraded vector Gaussian broadcast channel with layered confidential messages.
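For context, the two ingredients named in the abstract have well-known scalar forms; a brief sketch is given below. The paper's contribution replaces the scalar parameter t with a positive semidefinite matrix, which is not reproduced here.

```latex
% Entropy power of an n-dimensional random vector X:
%   N(X) = \frac{1}{2\pi e} \exp\!\left( \tfrac{2}{n} h(X) \right).
%
% Costa's EPI (scalar form): for Z standard Gaussian, independent of X,
% and 0 \le t \le 1,
%   N\!\left( X + \sqrt{t}\, Z \right) \;\ge\; (1-t)\, N(X) + t\, N(X + Z),
% i.e. N(X + \sqrt{t} Z) is concave in t.
%
% I-MMSE relation (Guo--Shamai--Verd\'u), the engine of the
% perturbation proof:
%   \frac{d}{d\gamma}\, I\!\left( X ;\, \sqrt{\gamma}\, X + Z \right)
%     \;=\; \tfrac{1}{2}\, \mathrm{mmse}(\gamma),
% where
%   \mathrm{mmse}(\gamma)
%     = \mathbb{E}\!\left[ \left( X - \mathbb{E}\!\left[ X \,\middle|\,
%       \sqrt{\gamma}\, X + Z \right] \right)^{2} \right].
```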