Estimating entropy and entropy norm on data streams

  • Authors:
  • Amit Chakrabarti; Khanh Do Ba; S. Muthukrishnan

  • Affiliations:
  • Department of Computer Science, Dartmouth College, Hanover, NH; Department of Computer Science, Dartmouth College, Hanover, NH; Department of Computer Science, Rutgers University, Piscataway, NJ

  • Venue:
  • STACS'06: Proceedings of the 23rd Annual Symposium on Theoretical Aspects of Computer Science
  • Year:
  • 2006

Abstract

We consider the problem of computing information-theoretic functions such as entropy on a data stream, using sublinear space. Our first result deals with a measure we call the “entropy norm” of an input stream: it is closely related to entropy but is structurally similar to the well-studied notion of frequency moments. We give a polylogarithmic-space one-pass algorithm for estimating this norm under certain conditions on the input stream. We also prove a lower bound that rules out such an algorithm if these conditions do not hold. Our second group of results concerns estimating the empirical entropy of an input stream. We first present a sublinear-space one-pass algorithm for this problem. For a stream of m items and a given real parameter α, our algorithm uses space $\tilde{O}(m^{2\alpha})$ and provides a (1/α)-approximation in the worst case and a (1+ε)-approximation in “most” cases. We then present a two-pass, polylogarithmic-space (1+ε)-approximation algorithm. All our algorithms are quite simple.
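
For reference, here is a minimal sketch of the quantities involved, using notation that is our own gloss rather than quoted from the paper: write $m_i$ for the frequency of item $i$ in a stream of length $m$. The entropy norm and the empirical entropy are then typically defined as

$$F_H = \sum_i m_i \lg m_i, \qquad H = \sum_i \frac{m_i}{m} \lg \frac{m}{m_i} = \lg m - \frac{F_H}{m}.$$

The first expression has the same shape as a frequency moment $F_k = \sum_i m_i^k$ with the summand $m_i^k$ replaced by $m_i \lg m_i$, which is why the “entropy norm” is structurally similar to frequency moments, while the identity on the right makes precise the sense in which it is closely related to the empirical entropy.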