Extensions of Sanov's Theorem and the Conditional Limit Theorem (CoLT) are established for a multicolor Polya-Eggenberger (PE) urn sampling scheme, yielding the Polya information divergence and a Polya extension of the Maximum Relative Entropy (MaxEnt) method. Polya MaxEnt includes standard MaxEnt, as well as its variants used in Bose-Einstein, Fermi-Dirac, and intermediate (Acharya-Swamy) statistics, as special cases. In the PE setting, standard MaxEnt is, in general, asymptotically inconsistent.
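To make the sampling scheme concrete, the following is a minimal sketch of a multicolor Polya-Eggenberger urn: after each draw, the drawn ball is returned together with a fixed number of additional balls of the same color, so earlier draws reinforce later ones. The function name, interface, and parameter choices here are illustrative assumptions, not taken from the paper.

```python
import random

def polya_eggenberger_sample(initial_counts, reinforcement, n_draws, rng=None):
    """Simulate one run of a multicolor Polya-Eggenberger urn.

    initial_counts: list of ball counts, one entry per color.
    reinforcement:  number of extra balls of the drawn color added back
                    after each draw (0 recovers i.i.d. sampling).
    Returns the empirical frequency of each color over n_draws draws.
    """
    rng = rng or random.Random()
    counts = list(initial_counts)
    draws = [0] * len(counts)
    for _ in range(n_draws):
        total = sum(counts)
        # Draw a ball uniformly at random: color i with probability counts[i]/total.
        r = rng.randrange(total)
        color = 0
        while r >= counts[color]:
            r -= counts[color]
            color += 1
        draws[color] += 1
        counts[color] += reinforcement  # replace the ball plus extra copies
    return [d / n_draws for d in draws]
```

With `reinforcement = 0` the urn composition never changes and the draws are i.i.d. from the initial color proportions, which is the regime of the classical Sanov theorem; with positive reinforcement the draws are exchangeable but not independent, which is why the large-deviation rate function changes from the standard relative entropy to a Polya divergence.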