The (noiseless) fixed-length source coding theorem states that, except for outcomes in a set of vanishing probability, a source can be encoded at its entropy but not more efficiently. It is well known that the asymptotic equipartition property (AEP) is a sufficient condition for a source to be encodable at its entropy. This paper shows that the AEP is necessary for the source coding theorem to hold for nonzero-entropy finite-alphabet sources. Furthermore, we show that a nonzero-entropy finite-alphabet source satisfies the direct coding theorem if and only if it satisfies the strong converse. In addition, we introduce the more general setting of nonserial information sources, which need not put out strings of symbols. In this context, which encompasses the conventional serial setting, the AEP is equivalent to the validity of the strong coding theorem. Fundamental limits for data compression of nonserial information sources are shown based on the flat-top property, a new sufficient condition for the AEP.
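The connection between the AEP and fixed-length coding at entropy can be illustrated numerically. The sketch below (an illustration of the standard typical-set argument, not code from the paper; the source parameters `p`, `n`, and tolerance `eps` are arbitrary choices) enumerates all length-n outputs of an i.i.d. Bernoulli source and measures the typical set, i.e. the sequences whose per-symbol surprisal is within eps of the entropy H. The AEP says this set carries almost all the probability while containing only about 2^(nH) sequences, so a fixed-length code of roughly n(H + eps) bits suffices outside a set of vanishing probability:

```python
import itertools
import math

def binary_entropy(p: float) -> float:
    """Entropy in bits of a Bernoulli(p) source."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p = 0.3          # Bernoulli parameter (assumed for illustration)
n = 16           # block length, small enough to enumerate exhaustively
eps = 0.1        # typicality tolerance
H = binary_entropy(p)

typical_prob = 0.0   # total probability of the typical set
typical_count = 0    # number of typical sequences

# A sequence x is (weakly) typical if |-(1/n) log2 P(x) - H| <= eps.
for x in itertools.product((0, 1), repeat=n):
    k = sum(x)  # number of ones determines P(x) for an i.i.d. source
    log_px = k * math.log2(p) + (n - k) * math.log2(1 - p)
    if abs(-log_px / n - H) <= eps:
        typical_prob += 2.0 ** log_px
        typical_count += 1

# Fixed-length code: index only the typical sequences.
code_bits = math.ceil(math.log2(typical_count))
print(f"H = {H:.4f} bits/symbol")
print(f"typical set probability: {typical_prob:.4f}")
print(f"code length {code_bits} bits vs. raw {n} bits "
      f"(AEP bound: n(H + eps) = {n * (H + eps):.1f})")
```

Rerunning with larger n (via Monte Carlo sampling rather than full enumeration) drives the typical-set probability toward 1, which is exactly the sufficiency direction of the AEP that the abstract contrasts with its necessity result.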