Nearest neighbour search (NNS) is a long-standing problem of practical importance in many fields. It involves finding, for a given query point q, one or more points from a given set that are nearest to q. Since the problem's inception, a great number of algorithms and techniques have been proposed for its solution. However, many of these algorithms have never been compared against each other on a wide variety of datasets. This research attempts to fill that gap to some extent by presenting a detailed empirical comparison of three prominent data structures for exact NNS: KD-Trees, Metric Trees, and Cover Trees. Our results suggest that there is generally little to gain from using Metric Trees or Cover Trees instead of KD-Trees for the standard NNS problem.
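The exact NNS problem described above can be illustrated with a minimal brute-force baseline (a hypothetical sketch, not code from the paper; the tree structures compared here exist precisely to answer this same query faster than a linear scan):

```python
from math import dist  # Euclidean distance between two points (Python 3.8+)

def nearest_neighbour(query, points):
    """Exact NNS by linear scan: return the point in `points` closest to `query`.

    Runs in O(n) distance computations per query; KD-Trees, Metric Trees,
    and Cover Trees aim to answer the same query with far fewer comparisons.
    """
    return min(points, key=lambda p: dist(p, query))

# Example: the nearest stored point to the query (0.9, 0.8)
points = [(0.0, 0.0), (1.0, 1.0), (2.0, 0.5)]
print(nearest_neighbour((0.9, 0.8), points))  # -> (1.0, 1.0)
```

Extending the scan to return the k nearest points (rather than only the single closest) is a matter of sorting by distance and taking the first k entries.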