We provide a simple physical interpretation, in the context of the second law of thermodynamics, of the information inequality (a.k.a. the Gibbs inequality, which is also equivalent to the log-sum inequality), which asserts that the relative entropy between two probability distributions cannot be negative. Since this inequality stands at the basis of the data processing theorem (DPT), and the DPT in turn is at the heart of most, if not all, proofs of converse theorems in Shannon theory, it is observed that, conceptually, the roots of the fundamental limits of information theory can be attributed to the laws of physics: in particular, to the second law of thermodynamics and, indirectly, to the law of energy conservation. By the same token, in the other direction, one can view the second law as stemming from information-theoretic principles.
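For concreteness, the information inequality invoked above can be stated and proved in a few lines. For probability distributions $P$ and $Q$ on a common alphabet $\mathcal{X}$ (with $Q(x) > 0$ wherever $P(x) > 0$), Jensen's inequality applied to the concave logarithm gives:

```latex
D(P \| Q) \;=\; \sum_{x \in \mathcal{X}} P(x) \log \frac{P(x)}{Q(x)}
\;=\; -\sum_{x \in \mathcal{X}} P(x) \log \frac{Q(x)}{P(x)}
\;\ge\; -\log \sum_{x \in \mathcal{X}} P(x) \cdot \frac{Q(x)}{P(x)}
\;=\; -\log \sum_{x \in \mathcal{X}} Q(x) \;=\; 0,
```

with equality if and only if $P = Q$. It is this nonnegativity of the relative entropy $D(P \| Q)$ that the abstract identifies with the second law of thermodynamics.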