Deterministic Complexity and Entropy
Fundamenta Informaticae - Contagious Creativity - In Honor of the 80th Birthday of Professor Solomon Marcus
Modern information theory is founded on the ideas of Hartley and Shannon, amongst others. From a practitioner's standpoint, Shannon's probabilistic framework carries certain impediments to the practical measurement of information, such as requiring a priori knowledge of a source's characteristics. Moreover, such a statistical formulation of entropy is an asymptotic limit, meaningful only within the context of an ensemble of messages. It thus fails to address the notion of an individual string having information content in and of itself.

However, in 1953, Cherry demonstrated that Shannon's entropy could be viewed equivalently as a measure of the average number of selections required to identify each message symbol from the alphabet. This terminology contrasts with Shannon's probabilistic formulation, and the process of counting selection steps appears to be meaningful for individual, isolated, finite strings.

We explore this alternative approach in the context of a recursive hierarchical pattern-copying (RHPC) algorithm, which we use to measure the complexity of finite strings in terms of the number of steps required to construct the string recursively from its alphabet. From this we compute an effective rate of steps per symbol required for linearly constructing the string.

By Cherry's interpretation of Shannon's entropy, we infer an asymptotic equivalence between the two approaches, but perhaps the real significance of this new way of measuring information is its applicability and usefulness in evaluating individual finite strings.
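The steps-per-symbol idea can be illustrated with a deliberately simplified sketch. The abstract does not specify the RHPC algorithm itself, so the code below substitutes an LZ78-style greedy pattern-copying parse as a stand-in: the dictionary starts from the string's alphabet, each step copies the longest previously seen pattern (learning its one-symbol extension), and the step count divided by the string length gives an effective construction rate. The function names are illustrative, not the authors'.

```python
def copy_steps(s):
    """Count greedy pattern-copying steps needed to construct s.

    NOTE: this is an LZ78-style illustration of step counting,
    not the RHPC algorithm described in the paper.
    """
    dictionary = set(s)  # begin with the string's alphabet
    steps = 0
    i = 0
    n = len(s)
    while i < n:
        j = i + 1
        # extend the copy while the prefix is a known pattern
        while j < n and s[i:j + 1] in dictionary:
            j += 1
        if j < n:
            dictionary.add(s[i:j + 1])  # learn the extended pattern
        steps += 1
        i = j
    return steps


def steps_per_symbol(s):
    """Effective construction rate: parse steps per symbol."""
    return copy_steps(s) / len(s)
```

On this toy measure, a highly repetitive string such as `"a" * 16` needs fewer steps per symbol than a string of all-distinct symbols, mirroring the intuition that the per-symbol construction rate tracks an entropy-like quantity for an individual finite string.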