Sampling lower bounds via information theory

  • Authors: Ziv Bar-Yossef
  • Affiliation: IBM Almaden Research Center, San Jose, CA
  • Venue: Proceedings of the Thirty-Fifth Annual ACM Symposium on Theory of Computing (STOC)
  • Year: 2003

Abstract

We present a novel technique, based on the Jensen-Shannon divergence from information theory, to prove lower bounds on the query complexity of sampling algorithms that approximate functions over arbitrary domains and ranges. Unlike previous methods, our technique does not use a reduction from a decision promise problem. As a result, it gives stronger bounds for functions that possess a large set of inputs, every two of which exhibit a gap in the function value.

We demonstrate the technique with new query complexity lower bounds for three fundamental problems: (1) the "election problem", for which we obtain a quadratic improvement over previous bounds; (2) low-rank matrix approximation, for which we prove the first lower bounds, showing that the algorithms given for this problem are almost optimal; and (3) matrix reconstruction.

In addition, we introduce a new method for proving lower bounds on the expected query complexity of functions, using the Kullback-Leibler divergence. We demonstrate its use with a simple query complexity lower bound for the mean.
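The lower bounds rest on standard information-theoretic divergences between the distributions of samples induced by different inputs. The following is a minimal Python sketch (assuming NumPy is available) of the two quantities named in the abstract, the Kullback-Leibler divergence and the Jensen-Shannon divergence, for discrete distributions; it only illustrates the definitions, not the paper's lower-bound arguments.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits for discrete distributions.

    Assumes p and q are probability vectors over the same finite support,
    with q[i] > 0 wherever p[i] > 0.
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def js_divergence(p, q):
    """Jensen-Shannon divergence: average KL divergence to the midpoint mixture."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# Illustration (values are hypothetical): two input distributions that a
# sampling algorithm must tell apart. A small Jensen-Shannon divergence
# between the induced sample distributions means few queries cannot
# distinguish them, which is the intuition behind such lower bounds.
p = [0.5, 0.5]
q = [0.6, 0.4]
print(js_divergence(p, q))  # small value: samples from p and q look alike
```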