Elements of information theory.
Some new statistics for testing hypotheses in parametric models. Journal of Multivariate Analysis.
Formulas for Rényi information and related measures for univariate distributions. Information Sciences: an International Journal.
Some results on generalized residual entropy. Information Sciences: an International Journal.
On the entropy of continuous probability distributions (Corresp.). IEEE Transactions on Information Theory.
On the convexity of some divergence measures based on entropy functions. IEEE Transactions on Information Theory.
On some entropy functionals derived from Rényi information divergence. Information Sciences: an International Journal.
Results on residual Rényi entropy of order statistics and record values. Information Sciences: an International Journal.
On uncertainty and information properties of ranked set samples. Information Sciences: an International Journal.
Recently, a measure of tail heaviness based on Rényi entropy has been proposed in the literature. This measure is useful because it can quantify tail heaviness even for distributions for which the kurtosis measure β₂ = μ₄/μ₂² does not exist. Nadarajah and Zografos [Nadarajah, Zografos, Information Sciences 153 (2003), 119-138] derived expressions for this measure for various univariate continuous distributions. However, their measure applies only to the lifetime of a new item; for a used item it requires modification. In this paper, we modify the measure so that it can be used for used items as well as new ones. We also derive expressions for the modified measure for sixteen univariate distributions and ten further standard distributions derived from the general distributions used in reliability and survival analysis. Most of the results obtained in the literature in this direction can be recovered as particular cases of our general results.
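As a minimal illustration of the quantity underlying the measure discussed above, the sketch below numerically approximates the Rényi entropy of order α, H_α = (1/(1−α)) log ∫ f(x)^α dx, for an exponential density, and checks it against the closed form H_α = −log λ + log α/(α − 1). The function name, the truncation point of the integral, and the grid size are illustrative choices, not part of the paper; this is a sketch of the standard definition, not the paper's modified measure for used items.

```python
import math

def renyi_entropy_exponential(lam, alpha, upper=60.0, n=200000):
    """Trapezoid-rule approximation of the Rényi entropy of order alpha,
    H_alpha = (1/(1-alpha)) * log( integral of f(x)**alpha dx ),
    for the Exp(lam) density f(x) = lam*exp(-lam*x), truncated at x = upper/lam."""
    h = (upper / lam) / n
    total = 0.0
    for i in range(n + 1):
        x = i * h
        fx = lam * math.exp(-lam * x)
        w = 0.5 if i in (0, n) else 1.0  # trapezoid endpoint weights
        total += w * fx ** alpha
    integral = total * h
    return math.log(integral) / (1.0 - alpha)

# Closed form for Exp(lam): H_alpha = -log(lam) + log(alpha)/(alpha - 1)
lam, alpha = 2.0, 0.5
closed = -math.log(lam) + math.log(alpha) / (alpha - 1.0)
approx = renyi_entropy_exponential(lam, alpha)
print(closed, approx)
```

The numerical value agrees with the closed form to several decimal places, which is the kind of consistency check that the tabulated expressions for specific distributions make possible. Note that, unlike β₂ = μ₄/μ₂², H_α remains finite for heavy-tailed laws (e.g. the Cauchy distribution, whose fourth moment does not exist), which is the motivation for the entropy-based measure.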