On the Behavior of Artificial Neural Network Classifiers in High-Dimensional Spaces. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Lower Bounds for Bayes Error Estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Fuzzy Sets and Systems, Featured Issue: Selected Papers from ACIDCA 2000. Pattern Recognition Letters.
Estimating Sales Opportunity Using Similarity-Based Methods. ECML PKDD '08: Proceedings of the European Conference on Machine Learning and Knowledge Discovery in Databases, Part II.
Local Subspace Classifier with Transform-Invariance for Image Classification. IEICE Transactions on Information and Systems.
Overlap pattern synthesis with an efficient nearest neighbor classifier. Pattern Recognition.
The bias of the finite-sample nearest neighbor (NN) error from its asymptotic value is examined. Expressions are derived that relate the bias of the NN and 2-NN errors to the sample size, dimensionality, metric, and class distributions. These expressions isolate the effect of sample size from that of the distributions, giving an explicit relation for how the bias changes as the sample size increases. Experimental results suggest that the expressions predict the bias accurately. It is shown that when the dimensionality of the data is high, it may not be possible to estimate the asymptotic error simply by increasing the sample size. A new procedure is proposed to alleviate this problem: the mean NN error is measured at several sample sizes, and the derived relationship between bias and sample size is used to extrapolate an estimate of the asymptotic NN error. The results are extended to the multiclass problem, and the choice of a metric that minimizes the bias is also discussed.
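The extrapolation procedure can be sketched in code. The sketch below is an illustrative reconstruction, not the paper's exact derivation: it assumes the bias decays as a power law in the sample size, err(n) ≈ a + b·n^(-2/d) (a model consistent with the sample-size dependence the abstract describes), measures the leave-one-out 1-NN error on synthetic two-class Gaussian data at several sample sizes, fits the model by least squares, and reads off the intercept a as the extrapolated asymptotic NN error. All data, sample sizes, and the specific bias model are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 2      # dimensionality (assumed for this synthetic example)
sep = 2.0  # class-mean separation along the first axis

def nn_error(n):
    """Leave-one-out 1-NN error on a balanced two-class Gaussian sample of size 2n."""
    X0 = rng.normal(0.0, 1.0, size=(n, d))
    X1 = rng.normal(0.0, 1.0, size=(n, d))
    X1[:, 0] += sep
    X = np.vstack([X0, X1])
    y = np.r_[np.zeros(n), np.ones(n)]
    # Brute-force pairwise Euclidean distances; exclude self-matches.
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(D, np.inf)
    pred = y[np.argmin(D, axis=1)]
    return float(np.mean(pred != y))

# Measure the mean NN error at several sample sizes.
sizes = np.array([25, 50, 100, 200, 400])
errs = np.array([nn_error(n) for n in sizes])

# Fit err(n) = a + b * n^(-2/d); the intercept a is the
# extrapolated estimate of the asymptotic NN error.
x = sizes.astype(float) ** (-2.0 / d)
b, a = np.polyfit(x, errs, 1)
print("measured errors:", errs)
print("asymptotic NN error estimate:", a)
```

For this configuration the asymptotic 1-NN error is bounded between the Bayes error and twice the Bayes error, so the intercept should land well below the small-sample measurements, which the fit makes visible without ever simulating a very large sample.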