Elements of information theory
An introduction to Kolmogorov complexity and its applications (2nd ed.)
Extremal Relations between Additive Loss Functions and the Kolmogorov Complexity
Problems of Information Transmission
IEEE Transactions on Information Theory
Jaynes's entropy concentration theorem states that, for most words $\omega_1 \dots \omega_N$ of length $N$ satisfying $\sum_{i=1}^{N} f(\omega_i) \approx vN$, the empirical frequencies of the values of a function $f$ are close to the probabilities that maximize the Shannon entropy subject to the constraint that the mathematical expectation of $f$ equals $v$. Using the notion of algorithmic entropy, we define notions of entropy for the Bose and Fermi statistical models of unordered data. New variants of Jaynes's concentration theorem are proved for these models. We also present some concentration properties of the free energy in the case of a nonisolated isothermal system. Exact relations for the algorithmic entropy and the free energy at extreme points are obtained. These relations are used to derive tight bounds on fluctuations of energy levels at equilibrium points.
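The maximum-entropy distribution referenced in the abstract is, classically, a Gibbs (exponentially tilted) distribution $p_j \propto e^{\beta f_j}$ whose tilt parameter $\beta$ is chosen so that the expectation of $f$ equals $v$. The sketch below (not from the paper; all names are illustrative) finds $\beta$ by bisection, using the fact that the tilted mean is monotone increasing in $\beta$:

```python
import math

def maxent_distribution(f_values, v, lo=-50.0, hi=50.0, iters=200):
    """Illustrative sketch: the maximum-entropy distribution over the
    values f_values subject to E[f] = v, i.e. p_j proportional to
    exp(beta * f_j), with beta found by bisection on the tilted mean."""
    def tilted_mean(beta):
        # Mean of f under the distribution p_j ~ exp(beta * f_j).
        w = [math.exp(beta * x) for x in f_values]
        z = sum(w)
        return sum(x * wi for x, wi in zip(f_values, w)) / z

    # tilted_mean is increasing in beta, so bisect on [lo, hi];
    # v must lie strictly between min(f_values) and max(f_values).
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if tilted_mean(mid) < v:
            lo = mid
        else:
            hi = mid
    beta = (lo + hi) / 2.0
    w = [math.exp(beta * x) for x in f_values]
    z = sum(w)
    return [wi / z for wi in w]
```

For example, with three equally spaced values and $v$ equal to their midpoint, the solution is the uniform distribution ($\beta = 0$); raising $v$ tilts the mass toward the larger values. Jaynes's theorem then says that most length-$N$ words whose empirical sum of $f$ is close to $vN$ have letter frequencies close to this distribution.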