Two new mean codeword lengths $L(\alpha,\beta)$ and $L(\beta)$ are defined, and it is shown that they satisfy the properties desirable of a measure of typical codeword length. Consequently, two new noiseless coding theorems subject to Kraft's inequality are proved. Further, we show that the mean codeword lengths $L_{1:1}(\alpha,\beta)$ and $L_{1:1}(\beta)$ of the best one-to-one code (not necessarily uniquely decodable) are shorter than the mean codeword lengths $L_{UD}(\alpha,\beta)$ and $L_{UD}(\beta)$ of the best uniquely decodable code by no more than $\log_D \log_D n + 3$ for $D = 2$. Moreover, we study tighter bounds on $L(\alpha,\beta)$.
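The one-to-one versus uniquely decodable comparison is concrete enough to check numerically. Below is a minimal Python sketch under stated assumptions: since the generalized lengths $L(\alpha,\beta)$ and $L(\beta)$ are defined only in the full paper, the ordinary (Shannon) mean codeword length $\sum_i p_i \ell_i$ is used as a stand-in. The sketch builds an optimal uniquely decodable binary code (Huffman), builds the best one-to-one code by giving the $k$-th most probable symbol the $k$-th shortest nonempty binary string (of length $\lfloor \log_2(k+1) \rfloor$), verifies Kraft's inequality $\sum_i 2^{-\ell_i} \le 1$ for the former, and checks that the gap between the two mean lengths stays within $\log_2 \log_2 n + 3$. All function names here are illustrative, not from the paper.

```python
import heapq
import math
import random

def huffman_lengths(probs):
    """Codeword lengths of an optimal uniquely decodable binary (Huffman) code."""
    # Heap entries: (probability, tiebreaker, indices of merged symbols).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    tiebreak = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1          # every symbol in a merged subtree gains one bit
        heapq.heappush(heap, (p1 + p2, tiebreak, s1 + s2))
        tiebreak += 1
    return lengths

def one_to_one_lengths(n):
    """Best one-to-one (not uniquely decodable) binary code: the k-th most
    probable symbol gets the k-th shortest nonempty binary string, so its
    length is floor(log2(k + 1))."""
    return [(k + 1).bit_length() - 1 for k in range(1, n + 1)]

random.seed(0)
n = 64
w = [random.random() for _ in range(n)]
total = sum(w)
probs = sorted((x / total for x in w), reverse=True)  # descending order

l_ud = huffman_lengths(probs)
l_11 = one_to_one_lengths(n)

# Kraft's inequality holds for the uniquely decodable code.
assert sum(2.0 ** -l for l in l_ud) <= 1.0 + 1e-12

# Ordinary mean codeword lengths (stand-in for L(alpha, beta)).
L_ud = sum(p * l for p, l in zip(probs, l_ud))
L_11 = sum(p * l for p, l in zip(probs, l_11))
gap_bound = math.log2(math.log2(n)) + 3

print(f"L_UD  = {L_ud:.4f} bits")
print(f"L_1:1 = {L_11:.4f} bits")
print(f"gap   = {L_ud - L_11:.4f} <= log2 log2 n + 3 = {gap_bound:.4f}")
assert L_ud - L_11 <= gap_bound
```

Note the design choice in `one_to_one_lengths`: because a one-to-one code need not be prefix-free, it may reuse the very short strings a uniquely decodable code must forgo, which is exactly why its mean length can undercut the Huffman length; the abstract's bound caps how large that advantage can be.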