Asymptotic slowing down of the nearest-neighbor classifier
NIPS-3: Proceedings of the 1990 Conference on Advances in Neural Information Processing Systems 3
The finite-sample performance of a nearest-neighbor classifier is analyzed for a two-class pattern recognition problem. An exact integral expression is derived for the m-sample risk $R_m$, given that a reference m-sample of labeled points, drawn independently from Euclidean n-space according to a fixed probability distribution, is available to the classifier. For a family of smooth distributions, it is shown that the m-sample risk $R_m$ has a complete asymptotic expansion $R_m \sim R_\infty + \sum_{k=2}^{\infty} c_k\, m^{-k/n}$, where $R_\infty$ denotes the nearest-neighbor risk in the infinite-sample limit. Explicit definitions of the expansion coefficients $c_k$ are given in terms of the underlying distribution. Since the convergence rate of $R_m \to R_\infty$ slows down dramatically as n increases, this analysis provides an analytic validation of Bellman's curse of dimensionality. Numerical simulations corroborating the formal results are included. The rates of convergence for less restrictive families of distributions are also discussed.
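As a rough illustration of the effect the abstract describes, the following Python sketch estimates the m-sample risk of a 1-nearest-neighbor rule by Monte Carlo on a simple two-class problem in R^n and prints how the estimated risk changes with m for several dimensions n. The Gaussian class-conditional densities, the shift parameter `delta`, and the sample sizes are illustrative assumptions, not the paper's actual simulation setup; the point is only that the gap between the finite-sample risk and its infinite-sample limit closes more slowly as n grows.

```python
# Minimal Monte Carlo sketch (assumed setup, not the paper's experiments):
# estimate the m-sample 1-NN risk R_m for a two-class Gaussian problem in R^n.
import numpy as np

rng = np.random.default_rng(0)

def sample(m, n, delta=1.0):
    """Draw m labeled points: class 0 ~ N(0, I_n), class 1 ~ N(delta*e_1, I_n)."""
    y = rng.integers(0, 2, size=m)
    x = rng.standard_normal((m, n))
    x[:, 0] += delta * y          # shift the first coordinate for class 1
    return x, y

def nn_risk(m, n, trials=2000, delta=1.0):
    """Monte Carlo estimate of the m-sample 1-nearest-neighbor risk R_m."""
    errors = 0
    for _ in range(trials):
        xr, yr = sample(m, n, delta)        # reference m-sample
        xq, yq = sample(1, n, delta)        # independent query point
        d2 = ((xr - xq) ** 2).sum(axis=1)   # squared Euclidean distances
        errors += int(yr[np.argmin(d2)] != yq[0])
    return errors / trials

# R_m approaches the infinite-sample nearest-neighbor risk as m grows,
# but the convergence is visibly slower in higher dimensions n.
for n in (1, 2, 4, 8):
    risks = [nn_risk(m, n) for m in (10, 100, 1000)]
    print(f"n={n}: R_10={risks[0]:.3f}  R_100={risks[1]:.3f}  R_1000={risks[2]:.3f}")
```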