We consider a general model for representing and manipulating parametric curves, in which a curve is specified by a black box mapping a parameter value between 0 and 1 to a point in Euclidean d-space. In this model, we consider the nearest-point-on-curve and farthest-point-on-curve problems: given a curve C and a point p, find a point on C nearest to p or farthest from p. In the general black-box model, no algorithm can solve these problems. Assuming a known bound on the speed of the curve (a Lipschitz condition), the answer can be estimated up to an additive error of ε using O(1/ε) samples, and this bound is tight in the worst case. However, many instances can be solved with substantially fewer samples, and we give algorithms that adapt to the inherent difficulty of the particular instance, up to a logarithmic factor. More precisely, if OPT(C,p,ε) is the minimum number of samples of C that every correct algorithm must perform to achieve tolerance ε, then our algorithm performs O(OPT(C,p,ε) · log(ε^{-1}/OPT(C,p,ε))) samples. Furthermore, any algorithm requires Ω(k · log(ε^{-1}/k)) samples for some instance C' with OPT(C',p,ε) = k; except that, for the nearest-point-on-curve problem when the distance between C and p is less than ε, OPT is 1 but the upper and lower bounds on the number of samples are both Θ(1/ε). When bounds on relative error are desired, we give algorithms that perform O(OPT · log(2 + (1 + ε^{-1}) · m^{-1}/OPT)) samples (where m is the exact minimum or maximum distance from p to C) and prove that Ω(OPT · log(1/ε)) samples are necessary on some problem instances.
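To make the adaptive idea concrete, here is a minimal sketch of a branch-and-bound nearest-point search that samples the black-box curve only where the Lipschitz condition cannot already rule out an improvement. This is an illustration of the general technique, not the paper's algorithm; the function names and the stopping rule are assumptions of this sketch. Given a Lipschitz bound L on the curve's speed, an interval [a, b] whose endpoint distances to p are d_a and d_b cannot contain a point closer to p than max(0, (d_a + d_b − L(b − a))/2), so intervals whose lower bound cannot beat the current best by more than ε are never refined.

```python
import heapq
import math

def nearest_point_on_curve(curve, p, L, eps):
    """Adaptive branch-and-bound sketch (not the paper's exact algorithm).

    curve: black-box map from t in [0, 1] to a point (sequence of floats)
    L: Lipschitz bound on the curve's speed, |C(s) - C(t)| <= L * |s - t|
    eps: additive error tolerance
    Returns (estimated minimum distance, parameter value achieving it);
    the estimate is within eps of the true minimum distance.
    """
    d0, d1 = math.dist(curve(0.0), p), math.dist(curve(1.0), p)
    best_d, best_t = (d0, 0.0) if d0 <= d1 else (d1, 1.0)

    def lower_bound(a, da, b, db):
        # For t in [a, b], the Lipschitz condition gives
        # dist(C(t), p) >= max(da - L*(t - a), db - L*(b - t)),
        # which is minimized where the two linear bounds cross.
        return max(0.0, (da + db - L * (b - a)) / 2.0)

    heap = [(lower_bound(0.0, d0, 1.0, d1), 0.0, d0, 1.0, d1)]
    while heap:
        lb, a, da, b, db = heapq.heappop(heap)
        if lb >= best_d - eps:
            break  # no remaining interval can improve the answer by > eps
        m = (a + b) / 2.0
        dm = math.dist(curve(m), p)  # one new sample per refinement
        if dm < best_d:
            best_d, best_t = dm, m
        heapq.heappush(heap, (lower_bound(a, da, m, dm), a, da, m, dm))
        heapq.heappush(heap, (lower_bound(m, dm, b, db), m, dm, b, db))
    return best_d, best_t
```

On an easy instance (e.g. p far from a short curve) the pruning rule discards most intervals after a handful of samples, while a worst-case instance can still force the Θ(1/ε) sampling the abstract describes.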