Erasure entropy rate differs from the Shannon entropy rate in that the conditioning is with respect to both the past and the future, as opposed to only the past (or only the future). In this paper, consistent universal algorithms for estimating the erasure entropy rate are proposed, based on the basic and extended context-tree weighting (CTW) algorithms. Simulation results for these algorithms applied to Markov sources, tree sources, and English text are compared with those obtained by fixed-order plug-in estimators of various orders.
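To make the fixed-order plug-in baseline concrete, the sketch below estimates the order-k erasure entropy H(X_0 | X_{-k}..X_{-1}, X_1..X_k) from empirical counts, conditioning each symbol on k past and k future symbols. This is a minimal illustration with names of my own choosing; it is the plug-in comparison method, not the paper's CTW-based estimators.

```python
import math
from collections import Counter

def erasure_entropy_plugin(seq, k):
    """Order-k plug-in estimate of the erasure entropy rate (in bits).

    Each symbol is conditioned on its k predecessors and k successors;
    the empirical conditional entropy of the middle symbol given the
    two-sided context is returned.
    """
    block = Counter()  # counts of (past context, symbol, future context)
    ctx = Counter()    # counts of (past context, future context)
    n = len(seq)
    for i in range(k, n - k):
        past = tuple(seq[i - k:i])
        fut = tuple(seq[i + 1:i + 1 + k])
        block[(past, seq[i], fut)] += 1
        ctx[(past, fut)] += 1
    total = sum(block.values())
    h = 0.0
    for (past, x, fut), c in block.items():
        p_joint = c / total          # empirical P(past, x, future)
        p_cond = c / ctx[(past, fut)]  # empirical P(x | past, future)
        h -= p_joint * math.log2(p_cond)
    return h
```

For a deterministic alternating source 0101..., the two-sided context determines the middle symbol exactly, so the estimate is 0; for an i.i.d. fair-coin source, it approaches 1 bit as the sample grows.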