We present a framework for information inequalities, namely inequalities involving only Shannon's information measures, for discrete random variables. A region in ℝ^(2^n − 1), denoted by Γ*, is identified as the origin of all information inequalities involving n random variables, in the sense that all such inequalities are partial characterizations of Γ*. One product of this framework is a simple calculus for verifying all unconstrained and constrained linear information identities and inequalities that can be proved by conventional techniques; these include all information identities and inequalities of such types in the literature. As a consequence of this work, most identities and inequalities involving a definite number of random variables can now be verified by software called ITIP, which is available on the World Wide Web. Our work suggests the possible existence of information inequalities that cannot be proved by conventional techniques. We also point out the relation between Γ* and some important problems in probability theory and information theory.
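As a small illustration of the kind of inequality the framework covers (this is not ITIP itself, and the function names are hypothetical), the following Python sketch checks the basic Shannon inequality I(X;Y) ≥ 0, i.e. H(X) + H(Y) ≥ H(X,Y), numerically for a given joint distribution:

```python
import math

def entropy(pmf):
    """Shannon entropy in bits of a pmf given as a dict value -> probability."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def check_nonneg_mutual_info(joint):
    """Verify H(X) + H(Y) >= H(X, Y), i.e. I(X; Y) >= 0.

    `joint` maps (x, y) pairs to probabilities summing to 1.
    A small tolerance guards against floating-point round-off.
    """
    # Marginalize the joint pmf to obtain the pmfs of X and Y.
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return entropy(px) + entropy(py) >= entropy(joint) - 1e-12

# Example: a correlated binary pair.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
print(check_nonneg_mutual_info(joint))  # prints True
```

ITIP works quite differently: it decides whether a candidate linear inequality is implied by the elemental Shannon inequalities (the inner bound Γ on Γ*) via linear programming, rather than by testing particular distributions as above.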