Kernel Difference-Weighted k-Nearest Neighbors Classification

  • Authors:
  • Wangmeng Zuo; Kuanquan Wang; Hongzhi Zhang; David Zhang

  • Affiliations:
  • School of Computer Science and Technology, Harbin Institute of Technology, 150001 Harbin, China (W. Zuo, K. Wang, H. Zhang); Department of Computing, The Hong Kong Polytechnic University, Kowloon, Hong Kong (D. Zhang)

  • Venue:
  • ICIC '07 Proceedings of the 3rd International Conference on Intelligent Computing: Advanced Intelligent Computing Theories and Applications. With Aspects of Artificial Intelligence
  • Year:
  • 2007

Abstract

The Nearest Neighbor (NN) rule is one of the simplest and most important methods in pattern recognition. In this paper, we propose a kernel difference-weighted k-nearest neighbor method (KDF-WKNN) for pattern classification. The proposed method formulates the weighted KNN rule as a constrained optimization problem, for which we derive an efficient solution to compute the weights of the different nearest neighbors. Unlike distance-weighted KNN, which assigns weights to the nearest neighbors according to their distances from the unclassified sample, KDF-WKNN weights the nearest neighbors using both the norm and the correlation of the differences between the unclassified sample and its nearest neighbors. Our experimental results indicate that KDF-WKNN outperforms the original KNN and distance-weighted KNN, and is comparable to some state-of-the-art methods in terms of classification accuracy.
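
The abstract describes the weight computation only at a high level. As a rough illustration, the sketch below gives one plausible reading of that description in Python: the neighbor weights minimize a quadratic form over the kernel Gram matrix of the differences between the query and its k nearest neighbors, subject to the weights summing to one. The function name kdf_wknn_predict, the Gaussian (RBF) kernel, the regularization term, and the exact form of the constrained problem are assumptions made for illustration, not the authors' precise formulation.

```python
import numpy as np
from collections import Counter


def kdf_wknn_predict(X_train, y_train, x, k=5, gamma=1.0, reg=1e-6):
    """Illustrative sketch of a kernel difference-weighted kNN prediction.

    Assumed formulation (not taken verbatim from the paper):
        minimize  w^T G w   subject to  sum(w) = 1,
    where G is the kernel Gram matrix of the difference vectors between
    the query x and its k nearest neighbors.
    """
    # 1. Find the k nearest neighbors of x (Euclidean distance).
    dists = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(dists)[:k]
    neighbors, labels = X_train[idx], y_train[idx]

    # 2. Difference vectors between the query and its neighbors.
    D = neighbors - x                               # shape (k, d)

    # 3. Kernel Gram matrix of the differences (Gaussian/RBF kernel assumed),
    #    so weights depend on both the norms and the correlations of the
    #    difference vectors.
    sq = np.sum(D ** 2, axis=1)
    G = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * (D @ D.T)))
    G += reg * np.eye(k)                            # regularize for stability

    # 4. Closed-form solution of the constrained problem:
    #    w = G^{-1} 1 / (1^T G^{-1} 1)
    ones = np.ones(k)
    w = np.linalg.solve(G, ones)
    w /= w.sum()

    # 5. Weighted vote: accumulate weights per class, return the largest.
    scores = Counter()
    for label, weight in zip(labels, w):
        scores[label] += weight
    return max(scores, key=scores.get)


# Example usage on toy data (hypothetical):
# rng = np.random.default_rng(0)
# X = rng.normal(size=(100, 2)); y = (X[:, 0] > 0).astype(int)
# print(kdf_wknn_predict(X, y, np.array([0.5, -0.2]), k=7))
```

Under this reading, the weights have the closed form w = G^{-1}1 / (1^T G^{-1}1), so each neighbor's weight reflects not only how small its difference vector is but also how that difference correlates with the other neighbors' differences, which matches the contrast with distance-weighted KNN that the abstract emphasizes.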