We consider the problem of approximately integrating a Lipschitz function f (with a known Lipschitz constant) over an interval. The goal is to achieve an error of at most ε using as few samples of f as possible. We use the adaptive framework: on all problem instances an adaptive algorithm should perform almost as well as the best possible algorithm tuned for the particular problem instance. We distinguish between DOPT and ROPT, the performances of the best possible deterministic and randomized algorithms, respectively. We give a deterministic algorithm that uses O(DOPT(f,ε) · log(ε^(-1)/DOPT(f,ε))) samples and show that an asymptotically better algorithm is impossible. However, any deterministic algorithm requires Ω(ROPT(f,ε)^2) samples on some problem instance. By combining a deterministic adaptive algorithm and Monte Carlo sampling with variance reduction, we give an algorithm that uses at most O(ROPT(f,ε)^(4/3) + ROPT(f,ε) · log(1/ε)) samples. We also show that any algorithm requires Ω(ROPT(f,ε)^(4/3) + ROPT(f,ε) · log(1/ε)) samples in expectation on some problem instance (f,ε), which proves that our algorithm is optimal.
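The abstract does not spell out the deterministic adaptive algorithm, but the general idea of adaptive Lipschitz integration can be illustrated with a minimal sketch: sample the endpoints of each subinterval, use the Lipschitz constant to bound the error of the trapezoid estimate on that subinterval, and greedily bisect the subinterval with the largest bound until the total bound is at most ε. The function name, the heap-based refinement rule, and the stopping condition below are our illustrative assumptions, not the paper's construction.

```python
import heapq

def adaptive_integrate(f, a, b, L, eps):
    """Estimate the integral of an L-Lipschitz f over [a, b] to within eps.

    Illustrative sketch only: greedily bisect the subinterval whose
    Lipschitz-based error bound is largest until the total bound is <= eps.
    """
    def bound(x, y, fx, fy):
        # Given endpoint samples, an L-Lipschitz function is confined
        # between two "hat" envelopes; the trapezoid estimate is off by
        # at most (L^2 (y-x)^2 - (fy-fx)^2) / (4L).
        return (L * L * (y - x) ** 2 - (fy - fx) ** 2) / (4 * L)

    fa, fb = f(a), f(b)
    e0 = bound(a, b, fa, fb)
    heap = [(-e0, a, b, fa, fb)]   # max-heap via negated bounds
    total = e0
    while total > eps:
        neg_e, x, y, fx, fy = heapq.heappop(heap)
        total += neg_e             # retire the parent's error bound
        m = (x + y) / 2
        fm = f(m)                  # one new sample per refinement
        for u, v, fu, fv in ((x, m, fx, fm), (m, y, fm, fy)):
            e = bound(u, v, fu, fv)
            total += e
            heapq.heappush(heap, (-e, u, v, fu, fv))
    # Trapezoid estimate over the final subdivision.
    return sum((y - x) * (fx + fy) / 2 for _, x, y, fx, fy in heap)
```

For example, `adaptive_integrate(math.sin, 0.0, math.pi, 1.0, 1e-4)` returns roughly 2, since sin is 1-Lipschitz. A randomized variant in the spirit of the abstract could use the piecewise-linear interpolant as a control variate and estimate the residual by Monte Carlo sampling, but we do not attempt to sketch the paper's variance-reduced algorithm here.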