Bias of Nearest Neighbor Error Estimates

  • Authors:
  • K. Fukunaga; D. M. Hummels


  • Venue:
  • IEEE Transactions on Pattern Analysis and Machine Intelligence
  • Year:
  • 1987


Abstract

The bias of the finite-sample nearest neighbor (NN) error from its asymptotic value is examined. Expressions are obtained which relate the bias of the NN and 2-NN errors to sample size, dimensionality, metric, and distributions. These expressions isolate the effect of sample size from that of the distributions, giving an explicit relation showing how the bias changes as the sample size is increased. Experimental results are given which suggest that the expressions accurately predict the bias. It is shown that when the dimensionality of the data is high, it may not be possible to estimate the asymptotic error simply by increasing the sample size. A new procedure is suggested to alleviate this problem. This procedure involves measuring the mean NN errors at several sample sizes and using our derived relationship between the bias and the sample size to extrapolate an estimate of the asymptotic NN error. The results are extended to the multiclass problem. The choice of an optimal metric to minimize the bias is also discussed.
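The extrapolation procedure described above can be sketched in code. This is a minimal illustration, not the paper's implementation: it assumes a bias model of the form ε(N) ≈ ε∞ + a·N^(−β) with the exponent β fixed in advance (here β = 2/d as an illustrative choice), measures the mean leave-one-out 1-NN error at several sample sizes on synthetic two-class Gaussian data, and extrapolates ε∞ by linear least squares. The data distribution, trial counts, and exponent are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_nn_error(n_per_class, dim=2, n_trials=20):
    """Mean leave-one-out 1-NN error over repeated draws of two
    Gaussian classes (means 0 and 1 in every dimension; an assumption
    for illustration, not the paper's experimental setup)."""
    errs = []
    for _ in range(n_trials):
        x0 = rng.normal(0.0, 1.0, size=(n_per_class, dim))
        x1 = rng.normal(1.0, 1.0, size=(n_per_class, dim))
        X = np.vstack([x0, x1])
        y = np.repeat([0, 1], n_per_class)
        # Pairwise squared Euclidean distances between all samples.
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
        np.fill_diagonal(d2, np.inf)  # exclude self-match (leave-one-out)
        pred = y[d2.argmin(axis=1)]
        errs.append((pred != y).mean())
    return float(np.mean(errs))

dim = 2
sizes = np.array([25, 50, 100, 200])
eps = np.array([mean_nn_error(n, dim=dim) for n in sizes])

# Fit eps(N) = eps_inf + a * N**(-beta) by linear least squares,
# with beta fixed by the assumed bias model.
beta = 2.0 / dim
A = np.column_stack([np.ones(len(sizes)), sizes.astype(float) ** (-beta)])
eps_inf, a = np.linalg.lstsq(A, eps, rcond=None)[0]
print(f"measured errors: {eps}")
print(f"extrapolated asymptotic NN error: {eps_inf:.3f}")
```

The point of the fit is that each measured error is biased upward relative to the asymptotic value, and the bias shrinks with sample size at a known rate; fitting that rate lets the intercept serve as the asymptotic estimate without requiring an impractically large sample.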