Given n discrete random variables Ω = {X_1, ···, X_n}, each subset α of {1, 2, ···, n} has an associated joint entropy H(X_α), where X_α = {X_i : i ∈ α}. This assignment can be viewed as a function on the power set 2^{1, 2, ···, n} taking values in [0, +∞); we call it the entropy function of Ω. The nonnegativity of the joint entropies implies that this function is nonnegative; the nonnegativity of the conditional joint entropies implies that it is nondecreasing; and the nonnegativity of the conditional mutual informations implies that it is submodular: for any two subsets α and β of {1, 2, ···, n}, H_Ω(α) + H_Ω(β) ≥ H_Ω(α ∪ β) + H_Ω(α ∩ β). These properties are the so-called basic information inequalities of Shannon's information measures. Do they fully characterize the entropy function? To make this question precise, we view an entropy function as a (2^n − 1)-dimensional vector whose coordinates are indexed by the nonempty subsets of the ground set {1, 2, ···, n}. Let Γ_n be the cone in ℝ^(2^n − 1) consisting of all vectors that have these three properties when viewed as functions on 2^{1, 2, ···, n}, and let Γ_n* be the set of all (2^n − 1)-dimensional vectors that arise as the entropy functions of some set of n discrete random variables. The question can then be restated: is it true that Γ̄_n* = Γ_n for every n, where Γ̄_n* denotes the closure of Γ_n*? The answer is "yes" for n = 2 and n = 3, as proved in our previous work, and intuition may suggest that it should be "yes" for every n. The main discovery of this paper is a new information-theoretic inequality involving four discrete random variables which gives a negative answer to this fundamental problem of information theory: Γ̄_n* is strictly smaller than Γ_n whenever n > 3. While the new inequality yields a nontrivial outer bound on the cone Γ̄_4*, an inner bound for Γ̄_4* is also given. The inequality is also extended to any number of random variables.
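The construction in the abstract translates directly into a short computational check. Below is a minimal Python sketch, not taken from the paper: the function names and the choice of four binary variables are illustrative. It computes the entropy vector of a joint distribution, verifies the three basic Shannon properties (nonnegativity, monotonicity, submodularity), and evaluates the four-variable inequality in its standard published Zhang–Yeung form, 2 I(C;D) ≤ I(A;B) + I(A;C,D) + 3 I(C;D|A) + I(C;D|B), which the abstract refers to but does not state.

```python
import itertools
import math
import random

def entropy_vector(pmf, n):
    """Entropy function of Omega = {X_0, ..., X_{n-1}}: maps each subset
    alpha of {0, ..., n-1} (as a frozenset) to the joint entropy
    H(X_alpha) in bits.  `pmf` maps n-tuples of outcomes to probabilities."""
    H = {frozenset(): 0.0}  # H of the empty set is 0 by convention
    for r in range(1, n + 1):
        for alpha in itertools.combinations(range(n), r):
            marginal = {}
            for outcome, p in pmf.items():
                key = tuple(outcome[i] for i in alpha)
                marginal[key] = marginal.get(key, 0.0) + p
            H[frozenset(alpha)] = -sum(p * math.log2(p)
                                       for p in marginal.values() if p > 0)
    return H

def satisfies_basic_inequalities(H, n, tol=1e-9):
    """Check the three basic Shannon properties from the abstract:
    nonnegativity, monotonicity, and submodularity."""
    subsets = [frozenset(s) for r in range(n + 1)
               for s in itertools.combinations(range(n), r)]
    for a in subsets:
        if H[a] < -tol:                                   # nonnegative
            return False
        for b in subsets:
            if a <= b and H[a] > H[b] + tol:              # nondecreasing
                return False
            if H[a] + H[b] + tol < H[a | b] + H[a & b]:   # submodular
                return False
    return True

def zhang_yeung_slack(H, A=0, B=1, C=2, D=3):
    """RHS minus LHS of the Zhang-Yeung inequality
    2 I(C;D) <= I(A;B) + I(A;C,D) + 3 I(C;D|A) + I(C;D|B),
    with every information quantity expanded into joint entropies."""
    S = lambda *xs: frozenset(xs)
    I_CD   = H[S(C)] + H[S(D)] - H[S(C, D)]
    I_AB   = H[S(A)] + H[S(B)] - H[S(A, B)]
    I_ACD  = H[S(A)] + H[S(C, D)] - H[S(A, C, D)]
    I_CDgA = H[S(A, C)] + H[S(A, D)] - H[S(A, C, D)] - H[S(A)]
    I_CDgB = H[S(B, C)] + H[S(B, D)] - H[S(B, C, D)] - H[S(B)]
    return I_AB + I_ACD + 3 * I_CDgA + I_CDgB - 2 * I_CD

# Sanity check on a random joint distribution of four binary variables.
random.seed(0)
weights = [random.random() for _ in range(16)]
pmf = {bits: w / sum(weights)
       for bits, w in zip(itertools.product((0, 1), repeat=4), weights)}
H = entropy_vector(pmf, 4)
assert satisfies_basic_inequalities(H, 4)
assert zhang_yeung_slack(H) >= -1e-9  # entropic points satisfy the inequality
```

Every entropic point satisfies the inequality, so the final assertion always holds; the paper's negative answer comes from exhibiting points of Γ_4 that satisfy the three basic properties yet violate this inequality, and hence lie outside Γ̄_4*.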