Universal Estimation of Information Measures for Analog Sources
Foundations and Trends in Communications and Information Theory
An MMSE approach to the secrecy capacity of the MIMO Gaussian wiretap channel
EURASIP Journal on Wireless Communications and Networking - Special issue on wireless physical layer security
On the entropy of compound distributions on nonnegative integers
IEEE Transactions on Information Theory
A criterion for the compound Poisson distribution to be maximum entropy
ISIT'09 Proceedings of the 2009 IEEE international conference on Symposium on Information Theory - Volume 3
Mismatched estimation and relative entropy
ISIT'09 Proceedings of the 2009 IEEE international conference on Symposium on Information Theory - Volume 2
The entropy power of a sum is fractionally superadditive
ISIT'09 Proceedings of the 2009 IEEE international conference on Symposium on Information Theory - Volume 1
Monotonic convergence in an information-theoretic law of small numbers
IEEE Transactions on Information Theory
Mismatched estimation and relative entropy
IEEE Transactions on Information Theory
The relationship between causal and noncausal mismatched estimation in continuous-time AWGN channels
IEEE Transactions on Information Theory
Monotonicity, thinning, and discrete versions of the entropy power inequality
IEEE Transactions on Information Theory
Discrete Applied Mathematics
Artstein, Ball, Barthe, and Naor have recently shown that the non-Gaussianness (divergence with respect to a Gaussian random variable with identical first and second moments) of the sum of independent and identically distributed (i.i.d.) random variables is monotonically nonincreasing. We give a simplified proof using the relationship between non-Gaussianness and minimum mean-square error (MMSE) in Gaussian channels. As Artstein et al. do, we also treat the more general setting of non-identically distributed random variables.
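The MMSE link behind the simplified proof rests on a well-known property: among all inputs of a given variance to a Gaussian channel, the Gaussian input is hardest to estimate, and the MMSE gap relative to the Gaussian input is what ties estimation error to divergence from Gaussianity. The following is a minimal numerical sketch of that gap, not code from the paper; the binary input choice and the Monte Carlo setup are illustrative assumptions.

```python
import numpy as np

# Sketch (not from the paper): in the channel Y = sqrt(snr)*X + N with
# N ~ N(0,1), a non-Gaussian unit-variance input attains strictly lower
# MMSE than a Gaussian input of the same variance. A binary input X in
# {-1,+1} is used here purely for illustration.

rng = np.random.default_rng(0)
n = 200_000  # Monte Carlo sample size

def mmse_binary(snr):
    """Monte Carlo MMSE for equiprobable X in {-1, +1}.

    For this input the conditional mean is E[X | Y=y] = tanh(sqrt(snr)*y).
    """
    x = rng.choice([-1.0, 1.0], size=n)
    y = np.sqrt(snr) * x + rng.standard_normal(n)
    xhat = np.tanh(np.sqrt(snr) * y)
    return float(np.mean((x - xhat) ** 2))

def mmse_gaussian(snr):
    """Closed-form MMSE for a unit-variance Gaussian input."""
    return 1.0 / (1.0 + snr)

for snr in (0.5, 1.0, 4.0):
    gap = mmse_gaussian(snr) - mmse_binary(snr)
    print(f"snr={snr}: MMSE gap = {gap:.4f}")  # positive for every snr > 0
```

The positive gap at every SNR is the quantity whose integral over SNR recovers the input's divergence from Gaussianity, which is the mechanism the abstract's MMSE-based proof exploits.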