The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam's inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. The Cramér-Rao inequality is a direct consequence of these two inequalities. In this paper, the inequalities above are extended to Rényi entropy, pth moment, and generalized Fisher information. Generalized Gaussian random densities are introduced and shown to be the extremal densities for the new inequalities. An extension of the Cramér-Rao inequality is derived as a consequence of these moment and Fisher information inequalities.
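The three classical inequalities the abstract starts from can be checked numerically in the one-dimensional, unit-variance case. The sketch below uses standard closed-form entropies and Fisher informations for the Gaussian, uniform, and Laplace densities (textbook values, not taken from the paper itself); the generalized Rényi/pth-moment versions introduced in the paper are not reproduced here.

```python
import math

# Closed-form differential entropies (in nats) and Fisher informations for
# three densities normalized to unit second moment. These are standard
# textbook values, used only to illustrate the classical inequalities.

# Gaussian N(0, 1): h = 0.5*ln(2*pi*e), Fisher information I = 1.
h_gauss = 0.5 * math.log(2 * math.pi * math.e)
i_gauss = 1.0

# Uniform on [-sqrt(3), sqrt(3)] (variance 1): h = ln(2*sqrt(3)).
h_unif = math.log(2 * math.sqrt(3))

# Laplace with scale b = 1/sqrt(2) (variance 2*b^2 = 1):
# h = 1 + ln(2b), Fisher information I = 1/b^2.
b = 1 / math.sqrt(2)
h_lap = 1 + math.log(2 * b)
i_lap = 1 / b**2

# Moment-entropy inequality: among unit-second-moment densities,
# the Gaussian has maximal Shannon entropy.
assert h_gauss > h_unif and h_gauss > h_lap

# Stam's inequality, stated via the entropy power N(X) = exp(2h)/(2*pi*e):
# N(X) * I(X) >= 1 in one dimension, with equality iff X is Gaussian.
n_gauss = math.exp(2 * h_gauss) / (2 * math.pi * math.e)
n_lap = math.exp(2 * h_lap) / (2 * math.pi * math.e)
assert abs(n_gauss * i_gauss - 1.0) < 1e-12   # Gaussian attains equality
assert n_lap * i_lap > 1.0                    # strict for the Laplace

# Cramer-Rao in its location-family form: Var(X) * I(X) >= 1.
assert 1.0 * i_gauss >= 1.0   # Gaussian: equality
assert 1.0 * i_lap >= 1.0     # Laplace: Var*I = 2
```

All three checks confirm the pattern the paper generalizes: the Gaussian is simultaneously the entropy maximizer at fixed second moment and the entropy minimizer at fixed Fisher information, and the Cramér-Rao bound follows from chaining the two extremal statements.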