Some properties of Rényi entropy and Rényi entropy rate
Information Sciences: an International Journal
We study certain properties of Rényi entropy functionals $H_\alpha(\mathcal{P})$ on the space of probability distributions over $\mathbb{Z}_+$. Primarily, continuity and convergence issues are addressed. Some properties are shown to parallel those known in the finite-alphabet case, while others illustrate quite different behavior of the Rényi entropy in the infinite case. In particular, it is shown that for any distribution $\mathcal{P}$ and any $r \in [0,\infty]$ there exists a sequence of distributions $\mathcal{P}_n$ converging to $\mathcal{P}$ with respect to the total variation distance and such that
$$\lim_{n \to \infty} \lim_{\alpha \to 1+} H_\alpha(\mathcal{P}_n) = \lim_{\alpha \to 1+} \lim_{n \to \infty} H_\alpha(\mathcal{P}_n) + r.$$
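The discontinuity phenomenon behind this result can be illustrated numerically. The sketch below is not from the paper: the function names and the specific perturbation (moving mass $1/n$ from a point mass onto a uniform block of size $2^{n^2}$) are illustrative choices. Under this perturbation the total variation distance $1/n$ tends to $0$, yet the Shannon entropy ($\alpha = 1$) diverges, while for fixed $\alpha > 1$ the Rényi entropy stays bounded.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(p) in nats; alpha = 1 gives Shannon entropy."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def perturbed_entropy(n, alpha):
    """H_alpha of the illustrative perturbation P_n of a point mass:
    P_n puts mass 1 - 1/n on one atom and spreads mass 1/n uniformly
    over a block of m = 2**(n*n) atoms, so TV(P_n, P) = 1/n -> 0.
    Closed forms avoid materializing the length-m probability vector."""
    eps = 1.0 / n
    m = 2 ** (n * n)  # size of the uniform block (illustrative choice)
    if np.isclose(alpha, 1.0):
        # Shannon entropy of [(1 - eps), eps/m, ..., eps/m]; grows like n*log 2
        return -(1 - eps) * np.log(1 - eps) - eps * np.log(eps / m)
    s = (1 - eps) ** alpha + m * (eps / m) ** alpha
    return np.log(s) / (1.0 - alpha)

for n in (2, 5, 10):
    print(n, perturbed_entropy(n, 1.0), perturbed_entropy(n, 2.0))
```

Running the loop shows the $\alpha = 1$ column growing without bound while the $\alpha = 2$ column remains small, a finite-$n$ glimpse of why the two limits in the displayed equation need not commute.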