Optimally adaptive integration of univariate Lipschitz functions

  • Authors:
  • Ilya Baran; Erik D. Demaine; Dmitriy A. Katz

  • Affiliations:
  • MIT Computer Science and Artificial Intelligence Laboratory, Cambridge, MA; MIT Computer Science and Artificial Intelligence Laboratory, Cambridge, MA; Sloan School of Management, Massachusetts Institute of Technology, Cambridge, MA

  • Venue:
  • LATIN'06: Proceedings of the 7th Latin American Conference on Theoretical Informatics
  • Year:
  • 2006

Abstract

We consider the problem of approximately integrating a Lipschitz function f (with a known Lipschitz constant) over an interval. The goal is to achieve an error of at most ε using as few samples of f as possible. We use the adaptive framework: on every problem instance, an adaptive algorithm should perform almost as well as the best possible algorithm tuned for that particular instance. We distinguish between DOPT and ROPT, the performance of the best possible deterministic and randomized algorithms, respectively. We give a deterministic algorithm that uses O(DOPT(f,ε) · log(ε^(-1)/DOPT(f,ε))) samples and show that an asymptotically better algorithm is impossible. However, any deterministic algorithm requires Ω(ROPT(f,ε)^2) samples on some problem instance. By combining a deterministic adaptive algorithm and Monte Carlo sampling with variance reduction, we give an algorithm that uses at most O(ROPT(f,ε)^(4/3) + ROPT(f,ε) · log(1/ε)) samples. We also show that any algorithm requires Ω(ROPT(f,ε)^(4/3) + ROPT(f,ε) · log(1/ε)) samples in expectation on some problem instance (f,ε), which proves that our algorithm is optimal.
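The deterministic adaptive idea can be illustrated with a minimal sketch (this is an illustration of greedy adaptive subdivision under the Lipschitz envelope bound, not the paper's exact algorithm or its analysis): for an L-Lipschitz function whose values are known at the endpoints of a subinterval of width h, the trapezoid estimate of its integral is off by at most L·h²/4, so one can repeatedly split the subinterval with the largest worst-case uncertainty until the total bound drops below ε.

```python
import heapq

def adaptive_lipschitz_integrate(f, a, b, L, eps):
    """Approximate the integral of an L-Lipschitz f over [a, b] to error <= eps.

    Greedy adaptive subdivision (an illustrative sketch): keep a max-heap of
    subintervals keyed by their worst-case error bound, and always split the
    worst one until the total bound is at most eps.
    """
    def node(x0, x1):
        fa, fb = f(x0), f(x1)
        h = x1 - x0
        # Trapezoid estimate; any L-Lipschitz function through (x0, fa) and
        # (x1, fb) has an integral within L*h^2/4 of it (triangular envelope).
        est = 0.5 * (fa + fb) * h
        err = L * h * h / 4.0
        return (-err, x0, x1, est, err)  # negate err for a max-heap

    heap = [node(a, b)]
    total_err = heap[0][4]
    while total_err > eps:
        _, x0, x1, _, err = heapq.heappop(heap)
        total_err -= err
        mid = 0.5 * (x0 + x1)
        for child in (node(x0, mid), node(mid, x1)):
            heapq.heappush(heap, child)
            total_err += child[4]
    return sum(item[3] for item in heap)
```

On flat regions the error bound shrinks quickly and few samples are spent, while near steep features the algorithm refines further; this instance-by-instance behavior is what the adaptive framework measures against DOPT. (The sketch re-evaluates f at shared endpoints; a careful implementation would cache samples.)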