Rényi's (1961) entropy and divergence of order α are given operational characterizations in terms of block coding and hypothesis testing, as so-called β-cutoff rates, with α = (1+β)^{-1} for entropy and α = (1-β)^{-1} for divergence. Out of several possible definitions of mutual information and channel capacity of order α, our approach distinguishes one that admits an operational characterization as the β-cutoff rate for channel coding, with α = (1-β)^{-1}. The ordinary cutoff rate of a DMC (discrete memoryless channel) corresponds to β = -1.
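As a concrete illustration (not part of the paper itself), here is a minimal Python sketch of the order-α quantities the abstract refers to, together with the β → α maps it states; the distributions `p` and `q` are arbitrary examples chosen only for the demonstration:

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(P) = log(sum_i p_i^alpha) / (1 - alpha) for alpha != 1."""
    if alpha == 1.0:
        # Shannon entropy is the limit as alpha -> 1
        return -sum(x * math.log(x) for x in p if x > 0)
    return math.log(sum(x ** alpha for x in p if x > 0)) / (1.0 - alpha)

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(P||Q) = log(sum_i p_i^alpha q_i^(1-alpha)) / (alpha - 1)."""
    if alpha == 1.0:
        # Kullback-Leibler divergence is the limit as alpha -> 1
        return sum(x * math.log(x / y) for x, y in zip(p, q) if x > 0)
    return math.log(sum(x ** alpha * y ** (1.0 - alpha)
                        for x, y in zip(p, q) if x > 0)) / (alpha - 1.0)

# The beta -> alpha correspondences stated in the abstract:
def alpha_for_entropy(beta):
    return 1.0 / (1.0 + beta)      # alpha = (1 + beta)^{-1}

def alpha_for_divergence(beta):
    return 1.0 / (1.0 - beta)      # alpha = (1 - beta)^{-1}

# Example distributions (illustrative only)
p = [0.5, 0.25, 0.25]
q = [1.0 / 3.0, 1.0 / 3.0, 1.0 / 3.0]

# beta = -1 (the ordinary cutoff rate of a DMC) gives alpha = 1/2
# under the channel-coding map alpha = (1 - beta)^{-1}.
a = alpha_for_divergence(-1.0)
print(a, renyi_entropy(p, a), renyi_divergence(p, q, a))
```

Note that β = -1 yields α = 1/2 under the divergence/channel-coding map, matching the abstract's remark that the ordinary cutoff rate corresponds to β = -1 (Rényi divergence of order 1/2 is closely related to the Bhattacharyya distance used in cutoff-rate expressions).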