Cores of cooperative games in information theory
EURASIP Journal on Wireless Communications and Networking - Theory and Applications in Multiuser/Multiterminal Communications
Universal Estimation of Information Measures for Analog Sources
Foundations and Trends in Communications and Information Theory
On the entropy of compound distributions on nonnegative integers
IEEE Transactions on Information Theory
A criterion for the compound Poisson distribution to be maximum entropy
ISIT'09 Proceedings of the 2009 IEEE international conference on Symposium on Information Theory - Volume 3
The Gaussian many-to-one interference channel with confidential messages
ISIT'09 Proceedings of the 2009 IEEE international conference on Symposium on Information Theory - Volume 3
The entropy power of a sum is fractionally superadditive
ISIT'09 Proceedings of the 2009 IEEE international conference on Symposium on Information Theory - Volume 1
Monotonic convergence in an information-theoretic law of small numbers
IEEE Transactions on Information Theory
Counterexamples to a proposed Stam inequality on finite groups
IEEE Transactions on Information Theory
Information inequalities for joint distributions, with interpretations and applications
IEEE Transactions on Information Theory
The relationship between causal and noncausal mismatched estimation in continuous-time AWGN channels
IEEE Transactions on Information Theory
Monotonicity, thinning, and discrete versions of the entropy power inequality
IEEE Transactions on Information Theory
Two-hop secure communication using an untrusted relay
EURASIP Journal on Wireless Communications and Networking - Special issue on wireless physical layer security
Robust copyright marking using Weibull distribution
Computers and Electrical Engineering
Maximum-information storage system: concept, implementation and application
Proceedings of the International Conference on Computer-Aided Design
Schur-Convexity on generalized information entropy and its applications
ICICA'11 Proceedings of the Second international conference on Information Computing and Applications
Entropy and set cardinality inequalities for partition-determined functions
Random Structures & Algorithms
Discrete Applied Mathematics
New families of Fisher information and entropy power inequalities for sums of independent random variables are presented. These inequalities relate the information in the sum of n independent random variables to the information contained in sums over subsets of those random variables, for an arbitrary collection of subsets. As a consequence, a simple proof of the monotonicity of information in central limit theorems is obtained, both in the setting of independent and identically distributed (i.i.d.) summands and in the more general setting of independent summands with variance-standardized sums.
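As a sketch of the form such subset-sum inequalities take (notation assumed here for illustration, not quoted from the paper): let X_1, ..., X_n be independent random variables with differential entropies h(X_i), and let N(X) denote the entropy power. For a collection C of subsets of {1, ..., n} in which each index i appears in at most r of the sets, an inequality of this family reads

\[
N\!\left(\sum_{i=1}^{n} X_i\right) \;\ge\; \frac{1}{r} \sum_{s \in C} N\!\left(\sum_{j \in s} X_j\right),
\qquad
N(X) := \frac{1}{2\pi e}\, e^{2h(X)} .
\]

Taking C to be all subsets of size n-1 (so each index appears in r = n-1 sets) recovers a monotonicity statement for the central limit theorem: the normalized entropy of the standardized sum cannot decrease as summands are added.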