Some intersection theorems for ordered sets and graphs
Journal of Combinatorial Theory Series A
Elements of information theory
Introduction to matrix analysis (2nd ed.)
Combinatorial auctions with decreasing marginal utilities
Proceedings of the 3rd ACM conference on Electronic Commerce
An Entropy Approach to the Hard-Core Model on Bipartite Graphs
Combinatorics, Probability and Computing
On maximizing welfare when utility functions are subadditive
Proceedings of the thirty-eighth annual ACM symposium on Theory of computing
Cores of cooperative games in information theory
EURASIP Journal on Wireless Communications and Networking - Theory and Applications in Multiuser/Multiterminal Communications
On characterization of entropy function via information inequalities
IEEE Transactions on Information Theory
Balanced information inequalities
IEEE Transactions on Information Theory
Zero-Error Source–Channel Coding With Side Information
IEEE Transactions on Information Theory
Two Constructions on Limits of Entropy Functions
IEEE Transactions on Information Theory
Networks, Matroids, and Non-Shannon Information Inequalities
IEEE Transactions on Information Theory
Generalized Entropy Power Inequalities and Monotonicity Properties of Information
IEEE Transactions on Information Theory
Note: Matchings and independent sets of a fixed size in regular graphs
Journal of Combinatorial Theory Series A
The number of independent sets in a regular graph
Combinatorics, Probability and Computing
Submodular functions are noise stable
Proceedings of the twenty-third annual ACM-SIAM symposium on Discrete Algorithms
Entropy and set cardinality inequalities for partition-determined functions
Random Structures & Algorithms
Upper and lower bounds are obtained for the joint entropy of a collection of random variables in terms of an arbitrary collection of subset joint entropies. These inequalities generalize Shannon's chain rule for entropy as well as inequalities of Han, Fujishige, and Shearer. A duality between the upper and lower bounds for joint entropy is developed. All of these results are shown to be special cases of general, new results for submodular functions; thus, the inequalities presented constitute a richly structured class of Shannon-type inequalities. The new inequalities are applied to obtain new results in combinatorics, such as bounds on the number of independent sets in an arbitrary graph and the number of zero-error source-channel codes, as well as determinantal inequalities in matrix theory. A general inequality for relative entropies is also developed. Finally, connections of these results to the literature in economics, computer science, and physics are explored.
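The subset-entropy bounds mentioned in the abstract can be checked numerically on a small example. The sketch below (an illustration, not code from the paper) draws a random joint distribution over three binary variables and verifies two classical special cases: Han's inequality, H(X1,...,Xn) <= (1/(n-1)) * sum_i H(X_{-i}), and Shearer's lemma for a cover in which every coordinate appears in exactly t = 2 of the chosen subsets. All function names here are made up for the demonstration.

```python
import itertools
import math
import random

def entropy(dist):
    """Shannon entropy (in bits) of a distribution given as {outcome: prob}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def marginal(joint, coords):
    """Marginal distribution of the coordinates in `coords`."""
    out = {}
    for x, p in joint.items():
        key = tuple(x[i] for i in coords)
        out[key] = out.get(key, 0.0) + p
    return out

# A random joint distribution over three binary variables.
random.seed(0)
weights = [random.random() for _ in range(8)]
total = sum(weights)
joint = {x: w / total
         for x, w in zip(itertools.product([0, 1], repeat=3), weights)}

n = 3
h_full = entropy(joint)

# Han's inequality: average the entropies of the (n-1)-element subsets.
han_bound = sum(
    entropy(marginal(joint, [j for j in range(n) if j != i]))
    for i in range(n)
) / (n - 1)

# Shearer's lemma: the subsets {0,1}, {1,2}, {0,2} cover each coordinate twice.
cover = [(0, 1), (1, 2), (0, 2)]
shearer_bound = sum(entropy(marginal(joint, s)) for s in cover) / 2

print(h_full <= han_bound + 1e-9)      # Han's upper bound holds
print(h_full <= shearer_bound + 1e-9)  # Shearer's upper bound holds
```

Both bounds are instances of the fractional-cover inequalities that the paper generalizes to arbitrary submodular set functions.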