Weighting of the k-Nearest-Neighbors

  • Authors:
  • Konstantin Chernoff; Mads Nielsen

  • Venue:
  • ICPR '10 Proceedings of the 2010 20th International Conference on Pattern Recognition
  • Year:
  • 2010

Abstract

This paper presents two distribution-independent weighting schemes for k-Nearest-Neighbors (kNN). Applying the first scheme in a Leave-One-Out (LOO) setting corresponds to performing complete b-fold cross-validation (b-CCV), while applying the second scheme corresponds to performing bootstrapping in the limit of infinite iterations. We demonstrate that the soft kNN errors obtained through b-CCV can be reproduced by applying the weighted kNN in an LOO setting, and that the proposed weighting schemes can decrease the variance and improve the generalization of kNN in a CV setting.
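
To make the setting concrete, the sketch below estimates the LOO error of a rank-weighted kNN classifier, where each of the k nearest neighbors casts a vote scaled by a weight. The weight profiles, the function name `weighted_knn_loo_error`, and the toy data are hypothetical illustrations only; they are not the paper's b-CCV or bootstrap-derived weighting schemes.

```python
import numpy as np

def weighted_knn_loo_error(X, y, k, weights):
    """Leave-one-out (LOO) error of a rank-weighted kNN classifier.

    `weights[j]` is the vote weight of the (j+1)-th nearest neighbor;
    uniform weights recover plain kNN. Illustrative only -- not the
    paper's b-CCV or bootstrap weighting schemes.
    """
    n = len(y)
    classes = np.unique(y)
    # Pairwise squared Euclidean distances; exclude each point from its own
    # neighbor list to emulate leaving it out of the training set.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    np.fill_diagonal(d2, np.inf)
    errors = 0
    for i in range(n):
        nn = np.argsort(d2[i])[:k]                   # k nearest neighbors of point i
        votes = [(weights * (y[nn] == c)).sum() for c in classes]
        errors += classes[np.argmax(votes)] != y[i]  # hard LOO error for point i
    return errors / n

# Toy comparison: uniform weights (plain kNN) vs. a decaying rank profile.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)), rng.normal(2.0, 1.0, (50, 2))])
y = np.repeat([0, 1], 50)
k = 7
uniform = np.full(k, 1.0 / k)
decaying = np.exp(-np.arange(k))
decaying /= decaying.sum()  # hypothetical rank-based weights
print("LOO error, uniform weights :", weighted_knn_loo_error(X, y, k, uniform))
print("LOO error, decaying weights:", weighted_knn_loo_error(X, y, k, decaying))
```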