Optimal adaptive algorithms for finding the nearest and farthest point on a parametric black-box curve

  • Authors: Ilya Baran; Erik D. Demaine
  • Affiliations: MIT, Cambridge, MA (both authors)
  • Venue: SCG '04: Proceedings of the Twentieth Annual Symposium on Computational Geometry
  • Year: 2004

Abstract

We consider a general model for representing and manipulating parametric curves, in which a curve is specified by a black box mapping a parameter value between 0 and 1 to a point in Euclidean d-space. In this model, we consider the nearest-point-on-curve and farthest-point-on-curve problems: given a curve C and a point p, find a point on C nearest to p or farthest from p. In the general black-box model, no algorithm can solve these problems. Assuming a known bound on the speed of the curve (a Lipschitz condition), the answer can be estimated up to an additive error of ε using O(1/ε) samples, and this bound is tight in the worst case. However, many instances can be solved with substantially fewer samples, and we give algorithms that adapt to the inherent difficulty of the particular instance, up to a logarithmic factor. More precisely, if OPT(C,p,ε) is the minimum number of samples of C that every correct algorithm must perform to achieve tolerance ε, then our algorithm performs O(OPT(C,p,ε) · log(ε⁻¹/OPT(C,p,ε))) samples. Furthermore, any algorithm requires Ω(k · log(ε⁻¹/k)) samples for some instance C' with OPT(C',p,ε) = k; except that, for the nearest-point-on-curve problem when the distance between C and p is less than ε, OPT is 1 but the upper and lower bounds on the number of samples are both Θ(1/ε). When bounds on relative error are desired, we give algorithms that perform O(OPT · log(2 + (1 + ε⁻¹) · m⁻¹/OPT)) samples (where m is the exact minimum or maximum distance from p to C) and prove that Ω(OPT · log(1/ε)) samples are necessary on some problem instances.
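
To make the black-box model concrete, here is a minimal Python sketch of the kind of Lipschitz-based adaptive sampling the abstract describes: a branch-and-bound search over parameter intervals that prunes any interval whose Lipschitz lower bound cannot improve the current best distance by more than ε. The function name, data layout, and specific pruning rule are illustrative assumptions, not the algorithm from the paper.

```python
import heapq
from math import cos, dist, pi, sin

def nearest_point_adaptive(curve, p, lipschitz, eps):
    """Estimate the minimum distance from p to curve([0,1]) within additive error eps.

    curve: black box mapping t in [0,1] to a point in R^d (a tuple of floats).
    lipschitz: bound on the curve's speed, |curve(s) - curve(t)| <= lipschitz * |s - t|.
    Illustrative sketch only; not the paper's algorithm.
    """
    d0, d1 = dist(curve(0.0), p), dist(curve(1.0), p)
    best = min(d0, d1)
    # Heap entries: (lower bound on distance within [a, b], a, b, dist at a, dist at b).
    # Lipschitz bound: for t in [a, b], the distance to p is at least
    # (d(a) + d(b) - lipschitz * (b - a)) / 2.
    root_lb = max(0.0, (d0 + d1 - lipschitz) / 2.0)
    heap = [(root_lb, 0.0, 1.0, d0, d1)]
    while heap:
        lower, a, b, da, db = heapq.heappop(heap)
        if lower >= best - eps:
            break  # no remaining interval can beat the current best by more than eps
        m = (a + b) / 2.0
        dm = dist(curve(m), p)  # one sample of the black box
        best = min(best, dm)
        for u, v, du, dv in ((a, m, da, dm), (m, b, dm, db)):
            child_lb = max(0.0, (du + dv - lipschitz * (v - u)) / 2.0)
            if child_lb < best - eps:
                heapq.heappush(heap, (child_lb, u, v, du, dv))
    return best

# Example: the distance from (2, 0) to the unit circle is 1; the speed of this
# parameterization is 2 * pi, so that is a valid Lipschitz bound.
circle = lambda t: (cos(2 * pi * t), sin(2 * pi * t))
print(nearest_point_adaptive(circle, (2.0, 0.0), lipschitz=2 * pi, eps=1e-3))
```

On easy instances (e.g. a point far from a gently curving arc) the pruning discards most intervals after few samples, while adversarial instances degrade toward the Θ(1/ε) uniform-sampling bound; this instance-dependent behavior is what the abstract quantifies via OPT(C,p,ε).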