On kernel difference-weighted k-nearest neighbor classification

  • Authors:
  • Wangmeng Zuo; David Zhang; Kuanquan Wang

  • Affiliations:
  • Harbin Institute of Technology, School of Computer Science and Technology, Harbin 150001, China; Hong Kong Polytechnic University, Biometrics Research Centre, Department of Computing, Kowloon, Hong Kong; Harbin Institute of Technology, School of Computer Science and Technology, Harbin 150001, China

  • Venue:
  • Pattern Analysis & Applications - Special Issue: Non-parametric distance-based classification techniques and their applications
  • Year:
  • 2008

Abstract

The nearest neighbor (NN) rule is one of the simplest and most important methods in pattern recognition. In this paper, we propose a kernel difference-weighted k-nearest neighbor (KDF-KNN) method for pattern classification. The proposed method defines the weighted KNN rule as a constrained optimization problem, and we then propose an efficient solution for computing the weights of the different nearest neighbors. Unlike traditional distance-weighted KNN, which assigns a weight to each nearest neighbor according to its distance from the unclassified sample, difference-weighted KNN weighs the nearest neighbors by using the correlation of the differences between the unclassified sample and its nearest neighbors. To take effective nonlinear structure information into account, we further extend difference-weighted KNN to its kernel version, KDF-KNN. Our experimental results indicate that KDF-KNN outperforms the original KNN and distance-weighted KNN methods, and is comparable to or better than several state-of-the-art methods in terms of classification accuracy.
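The abstract's constrained-optimization view of neighbor weighting can be illustrated with a short sketch. The code below is not the authors' implementation; it assumes one common formulation of difference-based weights: minimize w'Gw subject to the weights summing to one, where G is the Gram matrix of the difference vectors between the query and its k nearest neighbors (a kernel version would replace the inner products in G with kernel evaluations). The function names and the regularization term are illustrative choices, not taken from the paper.

```python
import numpy as np

def difference_weights(x, neighbors, reg=1e-3):
    """Solve min_w w'Gw s.t. sum(w) = 1, where G is the Gram matrix
    of the difference vectors (x_i - x). The closed-form solution is
    w = G^{-1}1 / (1'G^{-1}1); a small ridge term keeps G invertible."""
    D = neighbors - x                        # difference vectors, shape (k, d)
    G = D @ D.T                              # Gram matrix of differences, (k, k)
    k = len(G)
    G = G + reg * (np.trace(G) / k + 1e-12) * np.eye(k)  # regularization (assumed)
    w = np.linalg.solve(G, np.ones(k))
    return w / w.sum()                       # enforce the sum-to-one constraint

def difference_weighted_knn_predict(x, X_train, y_train, k=5):
    """Classify x by summing each class's difference-based neighbor weights."""
    dist = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(dist)[:k]               # indices of the k nearest neighbors
    w = difference_weights(x, X_train[idx])
    labels = y_train[idx]
    classes = np.unique(labels)
    scores = np.array([w[labels == c].sum() for c in classes])
    return classes[np.argmax(scores)]
```

On a query near one class's cluster, the k nearest neighbors reconstruct the query well, and the class accumulating the largest total weight is returned.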