Elements of information theory
The complexity of approximating entropy
STOC '02: Proceedings of the thirty-fourth annual ACM symposium on Theory of computing
Convergence properties of functional estimates for discrete distributions
Random Structures & Algorithms - Special issue on analysis of algorithms dedicated to Don Knuth on the occasion of his (100)₈th birthday
Estimation of entropy and mutual information
Neural Computation
Universal entropy estimation via block sorting
IEEE Transactions on Information Theory
Estimating entropy on m bins given fewer than m samples
IEEE Transactions on Information Theory
Testing Symmetric Properties of Distributions
SIAM Journal on Computing
Testing Closeness of Discrete Distributions
Journal of the ACM (JACM)
We consider the problem of approximating the entropy of a discrete distribution P on a domain of size q, given access to n independent samples from the distribution. It is known that n ≥ q is necessary, in general, for a good additive estimate of the entropy. The problem of multiplicative entropy estimation was recently addressed by Batu, Dasgupta, Kumar, and Rubinfeld, who show that n = q^{1/α²} suffices for a factor-α approximation, for any α > 1. We introduce a new parameter of a distribution: its effective alphabet size q_ef(P). This is a more intrinsic property of the distribution, depending only on its entropy moments, and q_ef ≤ Õ(q); when the distribution P is essentially concentrated on a small part of the domain, q_ef ≪ q. We strengthen the result of Batu et al. by showing that it holds with q_ef in place of q. This has several implications. In particular, the rate of convergence of the maximum-likelihood entropy estimator (the empirical entropy), for both finite and infinite alphabets, is shown to be dictated by the effective alphabet size of the distribution; several new, and some known, facts about this estimator follow easily. Our main result is algorithmic. Although the effective alphabet size is, in general, an unknown parameter of the distribution, we give an efficient procedure, with access to the alphabet size only, that achieves a factor-α approximation of the entropy with n = Õ(exp{α^{1/4} · log^{3/4} q · log^{1/4} q_ef}). Assuming (for instance) that log q_ef ≪ log q, this is smaller than any power of q. Taking α → 1 then yields efficient additive estimates of the entropy as well, since a factor-α estimate deviates from the true entropy by at most (α − 1)·H(P) additively. In particular, this result shows that in many natural scenarios a tight estimate of the entropy can be obtained from a sublinear number of samples. Several extensions of these results are also discussed.
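For concreteness, the maximum-likelihood estimator discussed above is simply the entropy of the empirical distribution of the samples. Below is a minimal Python sketch, illustrative only and not the paper's algorithm; the Miller-Madow variant is a standard first-order bias correction included for context, and the small experiment exhibits the plug-in estimator's well-known downward bias when n ≪ q, the regime the paper addresses.

    import math
    import random
    from collections import Counter

    def empirical_entropy(samples):
        """Plug-in (maximum-likelihood) estimate: the entropy, in bits,
        of the empirical distribution induced by the samples."""
        n = len(samples)
        return -sum((c / n) * math.log2(c / n)
                    for c in Counter(samples).values())

    def miller_madow(samples):
        """Standard first-order bias correction: add (k - 1) / (2 n ln 2)
        bits, where k is the number of distinct symbols observed."""
        n, k = len(samples), len(set(samples))
        return empirical_entropy(samples) + (k - 1) / (2 * n * math.log(2))

    # Uniform distribution on q = 1000 symbols: true entropy log2(1000) ~ 9.97 bits.
    # With n << q the plug-in estimate is badly biased downward.
    random.seed(0)
    q = 1000
    for n in (100, 1_000, 10_000, 100_000):
        s = random.choices(range(q), k=n)
        print(f"n = {n:6d}: plug-in = {empirical_entropy(s):.2f} bits, "
              f"Miller-Madow = {miller_madow(s):.2f} bits")

With n = 100 samples on q = 1000 equiprobable symbols, the plug-in estimate cannot exceed log₂ 100 ≈ 6.64 bits, well below the true log₂ 1000 ≈ 9.97 bits; by n = 100,000 it is essentially exact. This is the convergence behavior that the abstract ties to the effective alphabet size.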