For α ≥ 1, the Vajda-type information measure Jα(X) is a quantity generalizing the Fisher information (FI), to which it reduces for α = 2. In this paper, a corresponding generalized entropy power Nα(X) is introduced, and the inequality Nα(X)Jα(X) ≥ n is proved; for α = 2 it reduces to the well-known inequality of Stam. The cases of equality are also determined. Furthermore, the Blachman-Stam inequality for the FI of convolutions is generalized to the Vajda information Jα(X), and both families of results are discussed in the context of information measures. In particular, logarithmic Sobolev inequalities (LSIs) are written in terms of the new, more general entropy-type information measure, and new information inequalities thereby arise. For special cases, this generalization recovers the well-known information measures and the corresponding bounds.
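For orientation, the classical α = 2 case referred to above is Stam's inequality; a sketch in standard notation (the paper's definitions of Jα and Nα for general α are not reproduced here) is:

```latex
% Classical Stam inequality: the alpha = 2 case of N_alpha(X) J_alpha(X) >= n.
% For a random vector X in R^n with density f, differential entropy
% h(X) = -\int f \log f, the entropy power and Fisher information are
N(X) = \frac{1}{2\pi e}\, e^{\frac{2}{n} h(X)},
\qquad
J(X) = \int_{\mathbb{R}^n} \frac{\lVert \nabla f(x) \rVert^{2}}{f(x)}\, dx,
% and Stam's inequality states
N(X)\, J(X) \;\ge\; n,
% with equality if and only if X is Gaussian.
```

The abstract's result Nα(X)Jα(X) ≥ n has the same product form, with J and N replaced by their α-generalizations, and the same Gaussian-type equality analysis.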