In a recent paper, Song [J. Stat. Plan. Infer. 93 (2001) 51] considered Rényi information of order λ and established its connection to the log-likelihood. From this relation an intrinsic distribution measure was proposed, and analytic expressions for this measure and for Rényi information were derived for some standard continuous distributions. In this paper, we derive analytical formulas for Rényi and Shannon entropies, as well as for Song's measure, for 26 flexible families of univariate continuous distributions. We believe that the results presented here will serve as an important reference for scientists and engineers in many areas.
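As a rough illustration of the kind of closed-form expression the paper tabulates, one can check a known formula numerically. The sketch below assumes the standard Rényi entropy definition H_λ(f) = (1/(1−λ)) ln ∫ f(x)^λ dx and compares a trapezoid-rule evaluation for the standard normal density against the well-known closed form ln(σ√(2π)) + ln λ / (2(λ−1)); the function and variable names are illustrative, not taken from the paper.

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of the normal distribution N(mu, sigma^2)."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def renyi_entropy(pdf, lam, lo=-10.0, hi=10.0, n=100001):
    """Numerical Renyi entropy H_lam = ln(integral of pdf^lam) / (1 - lam), lam != 1.

    Uses a simple trapezoid rule on [lo, hi]; the tails beyond that range
    are assumed negligible for the density in question.
    """
    h = (hi - lo) / (n - 1)
    total = 0.0
    for i in range(n):
        x = lo + i * h
        w = 0.5 if i in (0, n - 1) else 1.0  # trapezoid end-point weights
        total += w * pdf(x) ** lam
    return math.log(total * h) / (1.0 - lam)

lam = 2.0
numeric = renyi_entropy(normal_pdf, lam)
# Closed form for N(0, 1): ln(sqrt(2*pi)) + ln(lam) / (2*(lam - 1))
closed = math.log(math.sqrt(2 * math.pi)) + math.log(lam) / (2 * (lam - 1))
print(round(numeric, 6), round(closed, 6))  # both ≈ 1.2655
```

Letting λ → 1 in the closed form recovers the Shannon entropy of the normal, ½ ln(2πeσ²), which is the limiting relation the tabulated Rényi expressions share.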