On the AEP of word-valued sources
IEEE Transactions on Information Theory
A word-valued source Y = Y1, Y2,... is a discrete random process formed by sequentially encoding the symbols of a random process X = X1, X2,... with codewords from a codebook C. Such processes appear frequently in information theory (in particular, in the analysis of source-coding algorithms), so it is of interest to give conditions on X and C under which Y satisfies an ergodic theorem and possesses an asymptotic equipartition property (AEP). In this paper, we prove the following: 1) if X is asymptotically mean stationary (AMS), then Y satisfies a pointwise ergodic theorem and possesses an AEP; and 2) if the codebook C is prefix-free, then the entropy rate of Y equals the entropy rate of X normalized by the average codeword length.
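The construction and the entropy-rate relation in result 2) can be sketched as follows. This is a minimal illustration, not code from the paper: the i.i.d. binary source, the probabilities, and the prefix-free codebook {0 → '0', 1 → '10'} are all assumed for the example.

```python
import math
import random

def word_valued_source(xs, codebook):
    """Form Y by concatenating the codewords C[x] for each source symbol x."""
    for x in xs:
        yield from codebook[x]

# Hypothetical example: i.i.d. binary source X with P(X=0) = 0.7,
# encoded with the prefix-free codebook C below.
p = {0: 0.7, 1: 0.3}
C = {0: '0', 1: '10'}

# Entropy rate of X (bits per source symbol), for an i.i.d. source.
H_X = -sum(q * math.log2(q) for q in p.values())
# Average codeword length E[len(C[X])].
L = sum(p[x] * len(w) for x, w in C.items())
# Per result 2): with a prefix-free codebook, the entropy rate of Y
# is H(X) normalized by the average codeword length.
H_Y = H_X / L

random.seed(0)
xs = random.choices(list(p), weights=list(p.values()), k=20)
y = ''.join(word_valued_source(xs, C))
print(f"sample Y: {y}")
print(f"H(X) = {H_X:.4f} bits, E[len] = {L:.2f}, H(Y) = H(X)/E[len] = {H_Y:.4f} bits")
```

Here H(X) ≈ 0.8813 bits and E[len] = 1.3, giving an entropy rate for Y of roughly 0.678 bits per code symbol, which is the quantity the prefix-free condition in result 2) guarantees.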