The measure-theoretic definition of Kullback-Leibler relative-entropy (or simply KL-entropy) plays a basic role in defining various classical information measures on general spaces. Entropy, mutual information, and conditional forms of entropy can all be expressed in terms of KL-entropy, so the properties of their measure-theoretic analogs follow from those of measure-theoretic KL-entropy. These measure-theoretic definitions are key to extending the ergodic theorems of information theory to non-discrete cases. A fundamental theorem in this respect is the Gelfand-Yaglom-Perez (GYP) theorem [M.S. Pinsker, Information and Information Stability of Random Variables and Processes, Holden-Day, San Francisco, CA, 1960 (English ed., 1964, translated and edited by Amiel Feinstein), Theorem 2.4.2], which states that measure-theoretic relative-entropy equals the supremum of relative-entropies over all measurable partitions. This paper states and proves the GYP theorem for Rényi relative-entropy of order greater than one. Consequently, the result extends easily to Tsallis relative-entropy.
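As an illustration of the supremum-over-partitions statement (not from the paper itself), the partition form of the Rényi relative-entropy of order 2 can be evaluated numerically. A minimal sketch, assuming P = N(0,1) and Q = N(1,1), for which the measure-theoretic value has the closed form D_2(P||Q) = α(μ_P − μ_Q)²/(2σ²) = 1 nat: refining the partition increases the partition value toward that supremum.

```python
import math

def norm_cdf(x, mu=0.0, sigma=1.0):
    """CDF of the normal distribution N(mu, sigma^2)."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def renyi2_on_partition(n_bins, lo=-6.0, hi=7.0):
    """Order-2 Renyi relative-entropy (in nats) of P = N(0,1) w.r.t.
    Q = N(1,1), computed on a uniform partition of [lo, hi] into n_bins
    cells: (alpha - 1)^{-1} log sum_i P(E_i)^alpha Q(E_i)^{1-alpha},
    with alpha = 2.  The tiny tail mass outside [lo, hi] is ignored."""
    h = (hi - lo) / n_bins
    s = 0.0
    for i in range(n_bins):
        a, b = lo + i * h, lo + (i + 1) * h
        p = norm_cdf(b) - norm_cdf(a)                    # P-mass of cell
        q = norm_cdf(b, mu=1.0) - norm_cdf(a, mu=1.0)    # Q-mass of cell
        if p > 0.0:          # convention: a cell with P(E_i) = 0 contributes 0
            s += p * p / q
    return math.log(s)

# Uniform 5000-bin partition refines the 50-bin one, so the value can only
# grow; both stay below the measure-theoretic value of 1 nat (the supremum).
coarse = renyi2_on_partition(50)
fine = renyi2_on_partition(5000)
```

Here `coarse <= fine <= 1.0` holds (up to floating-point error), and `fine` is already within about 10⁻³ of the supremum, consistent with the GYP-type statement that the measure-theoretic Rényi relative-entropy is the supremum over measurable partitions.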