Some properties of Rényi entropy and Rényi entropy rate
Information Sciences: an International Journal
In this paper, we introduce a definition of the conditional Rényi entropy for continuous random variables and show that the corresponding chain rule holds. We then use this rule to derive an alternative expression for the Rényi entropy rate. Using this expression together with properties of the Rényi entropy, we obtain the Rényi entropy rate of stationary Gaussian processes. Finally, we show that the Shannon entropy rate bounds the Rényi entropy rate, and that the Rényi entropy rate reduces to the Shannon entropy rate as α → 1.
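The limiting behaviour described above can be illustrated for a single Gaussian random variable (not the entropy-rate result of the paper itself, but the same α → 1 mechanism). For X ~ N(0, σ²), the differential Rényi entropy has the closed form h_α(X) = ln(σ√(2π)) − ln(α) / (2(1 − α)), whose limit as α → 1 is the Shannon differential entropy ½ ln(2πeσ²). The following sketch, with a hypothetical helper name, evaluates this formula and checks the convergence numerically:

```python
import math

def renyi_entropy_gaussian(alpha: float, sigma: float = 1.0) -> float:
    """Differential Rényi entropy (in nats) of N(0, sigma^2).

    Closed form for alpha > 0, alpha != 1:
        h_alpha = ln(sigma * sqrt(2*pi)) - ln(alpha) / (2 * (1 - alpha))
    The alpha = 1 case is the Shannon differential entropy,
    recovered as the limit alpha -> 1 (e.g. by L'Hopital's rule).
    """
    if alpha <= 0:
        raise ValueError("alpha must be positive")
    if alpha == 1.0:
        # Shannon differential entropy: (1/2) * ln(2*pi*e*sigma^2)
        return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)
    return math.log(sigma * math.sqrt(2 * math.pi)) - math.log(alpha) / (2 * (1 - alpha))

if __name__ == "__main__":
    shannon = renyi_entropy_gaussian(1.0)
    # As alpha approaches 1, the Rényi entropy approaches the Shannon entropy.
    for alpha in (0.5, 0.9, 0.99, 0.999):
        print(f"alpha={alpha}: h_alpha={renyi_entropy_gaussian(alpha):.6f} "
              f"(Shannon: {shannon:.6f})")
```

For the Gaussian, h_α is decreasing in α, so values with α < 1 lie above the Shannon entropy and values with α > 1 lie below it, consistent with the Shannon entropy rate serving as a bound.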