Coarse to fine K nearest neighbor classifier

  • Authors:
  • Yong Xu; Qi Zhu; Zizhu Fan; Minna Qiu; Yan Chen; Hong Liu

  • Affiliations:
  • Bio-Computing Research Center, Shenzhen Graduate School, Harbin Institute of Technology, 518055 Shenzhen, China and Key Laboratory of Network Oriented Intelligent Computation, Shenzhen, China
  • Bio-Computing Research Center, Shenzhen Graduate School, Harbin Institute of Technology, 518055 Shenzhen, China
  • Bio-Computing Research Center, Shenzhen Graduate School, Harbin Institute of Technology, 518055 Shenzhen, China and School of Basic Science, East China Jiaotong University, Nanchang, Jiangxi, China ...
  • Bio-Computing Research Center, Shenzhen Graduate School, Harbin Institute of Technology, 518055 Shenzhen, China
  • Bio-Computing Research Center, Shenzhen Graduate School, Harbin Institute of Technology, 518055 Shenzhen, China
  • Key Laboratory of Machine Perception and Intelligence, Shenzhen Graduate School, Peking University, Shenzhen, China

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2013

Abstract

In this paper, we propose a coarse to fine K nearest neighbor (KNN) classifier (CFKNNC). Unlike the conventional KNN classifier (CKNNC), CFKNNC first coarsely determines a small set of training samples that are "close" to the test sample and then finely identifies the K nearest neighbors of the test sample within this set. The main difference between CFKNNC and CKNNC is that CFKNNC exploits "representation-based distances", whereas CKNNC uses Euclidean distances, to determine the nearest neighbors of the test sample among the training samples. The analysis shows that the "representation-based distances" are able to take into account the dependencies between different training samples. Indeed, the nearest neighbors determined by the proposed method are optimal from the point of view of representing the test sample. Moreover, the nearest neighbors obtained using our method contain less redundant information than those obtained using CKNNC. The experimental results show that CFKNNC classifies much more accurately than CKNNC and various improvements to CKNNC such as the nearest feature line (NFL) classifier, the nearest feature space (NFS) classifier, the nearest neighbor line classifier (NNLC), and the center-based nearest neighbor classifier (CBNNC).
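
The two-stage, representation-based neighbor selection described in the abstract can be sketched as follows. This is a minimal illustration of the general idea rather than the paper's exact formulation: the function name cfknn_predict and the parameters n_coarse, k, and reg are chosen here for illustration, the coarse score uses each training sample's individual least-squares fit to the test sample, and the fine score ranks candidates by their deviation within a joint (ridge-regularized, purely for numerical stability) linear representation of the test sample.

```python
import numpy as np
from collections import Counter

def cfknn_predict(X_train, y_train, x_test, n_coarse=20, k=5, reg=1e-3):
    """Illustrative coarse-to-fine KNN prediction (sketch, not the paper's exact algorithm).

    X_train: (n_samples, n_features) training matrix
    y_train: (n_samples,) class labels
    x_test:  (n_features,) test sample
    """
    X_train = np.asarray(X_train, dtype=float)
    y_train = np.asarray(y_train)
    x_test = np.asarray(x_test, dtype=float)
    n = X_train.shape[0]

    # Coarse stage: score each training sample by how well it alone
    # represents the test sample (single-sample least-squares residual).
    coarse_scores = np.empty(n)
    for i in range(n):
        xi = X_train[i]
        a = (xi @ x_test) / (xi @ xi + reg)        # scalar least-squares coefficient
        coarse_scores[i] = np.linalg.norm(x_test - a * xi)
    candidates = np.argsort(coarse_scores)[:n_coarse]

    # Fine stage: represent the test sample as a linear combination of the
    # retained candidates and rank them by their representation-based
    # "distance" (deviation when only a candidate's weighted term is used).
    Xc = X_train[candidates]                        # (n_coarse, n_features)
    G = Xc @ Xc.T + reg * np.eye(len(candidates))   # regularized Gram matrix
    coeffs = np.linalg.solve(G, Xc @ x_test)        # weights of the joint representation
    fine_scores = np.array([np.linalg.norm(x_test - coeffs[j] * Xc[j])
                            for j in range(len(candidates))])
    neighbors = candidates[np.argsort(fine_scores)[:k]]

    # Classify by majority vote among the K selected neighbors.
    return Counter(y_train[neighbors]).most_common(1)[0][0]
```

In this sketch, the coarse stage cheaply discards most training samples before the joint representation is solved, so the fine stage accounts for dependencies only among the surviving candidates; the regularization term reg is an added assumption for stability and is not part of the abstract's description.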