We study the problem of estimating the average of a Lipschitz continuous function f defined over a metric space by querying f at only a single point. More specifically, we explore the role of randomness in drawing this sample: our goal is to find a distribution over query points that minimizes the expected estimation error against an adversarially chosen Lipschitz continuous function. Our work falls into the broad class of estimating aggregate statistics of a function from a small number of carefully chosen samples. The general problem has a wide range of practical applications in areas such as sensor networks, the social sciences, and numerical analysis. However, traditional work in numerical analysis has focused on asymptotic bounds, whereas we are interested in the exact optimal error. For arbitrary discrete metric spaces of bounded doubling dimension, we obtain a PTAS for this problem; in the special case when the points lie on a line, the running time improves to an FPTAS. For Lipschitz continuous functions over [0, 1], we determine the precise achievable error to be 1 − √3/2 ≈ 0.134, improving upon the bound of 1/4 that is best possible for deterministic algorithms.
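As a minimal sketch of the deterministic 1/4 bound mentioned above: a deterministic one-point estimator queries f at a fixed point x0 and reports f(x0). The adversary function f(x) = |x − x0| used below is a standard 1-Lipschitz construction, not taken from the paper; against it, no choice of x0 achieves worst-case error below 1/4.

```python
def adversary_error(x0):
    """Error of the one-point estimate f(x0) for the 1-Lipschitz
    adversary f(x) = |x - x0| on [0, 1].

    The true average of f is x0**2/2 + (1 - x0)**2/2, while the
    reported estimate f(x0) equals 0, so the error is the average itself.
    """
    return x0**2 / 2 + (1 - x0)**2 / 2

# Sweep candidate query points; the error is minimized at x0 = 1/2,
# where it equals exactly 1/4 -- matching the deterministic bound.
best = min(adversary_error(k / 100) for k in range(101))
print(best)  # prints 0.25
```

A randomized estimator, by spreading its query point over a distribution, denies the adversary knowledge of x0 and can push the expected error below this 1/4 barrier, which is precisely the gap the abstract quantifies.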