In this paper, we define the conditional Rényi entropy and show that the so-called chain rule holds for the Rényi entropy. We then introduce a relation for the Rényi entropy rate and use it to derive the rate of the Rényi entropy for an irreducible, aperiodic Markov chain. We also show that the Rényi entropy rate is bounded by the Shannon entropy rate.
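To make the quantities concrete: for a finite, irreducible Markov chain with transition matrix P = (p_ij), a known closed form (due to Rached, Alajaji, and Campbell for finite-alphabet Markov sources) gives the Rényi entropy rate of order α ≠ 1 as (1/(1−α)) log λ(α), where λ(α) is the Perron (largest positive) eigenvalue of the nonnegative matrix with entries p_ij^α; the Shannon entropy rate is the usual stationary average −Σ_i π_i Σ_j p_ij log p_ij. The following is a minimal NumPy sketch of both rates (in nats); the function names are illustrative, not from the paper.

```python
import numpy as np

def stationary_distribution(P):
    """Left Perron eigenvector of P, normalized to a probability vector."""
    vals, vecs = np.linalg.eig(P.T)
    v = vecs[:, np.argmax(vals.real)].real
    return v / v.sum()

def shannon_rate(P):
    """Shannon entropy rate (nats) of a stationary irreducible chain:
    -sum_i pi_i sum_j p_ij log p_ij, with 0 log 0 := 0."""
    pi = stationary_distribution(P)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(P > 0, P * np.log(P), 0.0)
    return -float(pi @ terms.sum(axis=1))

def renyi_rate(P, alpha):
    """Rényi entropy rate of order alpha != 1 (nats):
    (1/(1-alpha)) * log lambda(alpha), where lambda(alpha) is the
    Perron eigenvalue of the matrix with entries p_ij ** alpha."""
    Pa = P ** alpha
    lam = np.max(np.linalg.eigvals(Pa).real)
    return float(np.log(lam) / (1.0 - alpha))

# Example two-state chain: as alpha -> 1 the Rényi rate approaches
# the Shannon rate, and for alpha > 1 it lies below it.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
print(shannon_rate(P))
print(renyi_rate(P, 0.999))
print(renyi_rate(P, 2.0))
```

For this chain the stationary distribution is (2/3, 1/3), and the Rényi rate at α = 0.999 agrees with the Shannon rate to about three decimal places, consistent with the limiting behavior as α → 1.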