A novel nearest neighbor classifier based on adaptive nonparametric separability

  • Authors:
  • Bor-Chen Kuo;Hsin-Hua Ho;Cheng-Hsuan Li;Ya-Yuan Chang

  • Affiliations:
  • Graduate School of Educational Measurement and Statistics, National Taichung University, Taichung, Taiwan, R.O.C.;Graduate School of Educational Measurement and Statistics, National Taichung University, Taichung, Taiwan, R.O.C.;Department of Applied Mathematics, Feng Chia University, Taichung, Taiwan, R.O.C.;Graduate School of Educational Measurement and Statistics, National Taichung University, Taichung, Taiwan, R.O.C.

  • Venue:
  • AI'06: Proceedings of the 19th Australian Joint Conference on Artificial Intelligence: Advances in Artificial Intelligence
  • Year:
  • 2006

Abstract

A k-nearest-neighbor classifier assumes that the class conditional probabilities are locally constant. In this paper, we use local separability, measured by the NWFE criterion, to establish an effective metric that defines a new neighborhood. For each test pattern, the modified neighborhood shrinks along directions of high separability around the pattern and extends further along the other directions. The class conditional probabilities therefore tend to be more homogeneous within the modified neighborhood, which often improves classification performance, and any neighborhood-based classifier can be employed with it.
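
The adaptive-neighborhood idea described above can be sketched as follows. This is not the paper's NWFE-based procedure; as a stand-in, it uses local within- and between-class scatter matrices (in the style of discriminant adaptive nearest neighbor, DANN) to build a metric that stretches distances along directions of high local separability, so the neighborhood shrinks there and extends elsewhere. All function and parameter names are illustrative.

```python
import numpy as np

def adaptive_nn_predict(X_train, y_train, x, k=5, eps=1.0):
    """Classify x with a locally adapted metric (a DANN-style sketch,
    not the paper's NWFE criterion)."""
    # 1. Initial Euclidean neighborhood around the test pattern x.
    d = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(d)[: 3 * k]
    Xn, yn = X_train[idx], y_train[idx]

    # 2. Local within-class (W) and between-class (B) scatter matrices.
    mean = Xn.mean(axis=0)
    W = np.zeros((x.size, x.size))
    B = np.zeros((x.size, x.size))
    for c in np.unique(yn):
        Xc = Xn[yn == c]
        mc = Xc.mean(axis=0)
        W += (Xc - mc).T @ (Xc - mc)
        B += len(Xc) * np.outer(mc - mean, mc - mean)
    W += eps * np.eye(x.size)  # regularize so W is invertible

    # 3. Adapted metric Sigma = W^{-1/2}(W^{-1/2} B W^{-1/2} + eps I)W^{-1/2}:
    #    distances grow along separable directions, so the neighborhood
    #    shrinks there and extends along the others.
    evals, evecs = np.linalg.eigh(W)
    W_inv_sqrt = evecs @ np.diag(1.0 / np.sqrt(evals)) @ evecs.T
    Sigma = W_inv_sqrt @ (W_inv_sqrt @ B @ W_inv_sqrt
                          + eps * np.eye(x.size)) @ W_inv_sqrt

    # 4. k-NN majority vote under the adapted metric.
    diff = X_train - x
    d_adapt = np.einsum("ij,jk,ik->i", diff, Sigma, diff)
    nearest = y_train[np.argsort(d_adapt)[:k]]
    vals, counts = np.unique(nearest, return_counts=True)
    return vals[np.argmax(counts)]
```

Any neighborhood-based rule (weighted voting, local density estimates, etc.) could replace the majority vote in step 4, since only the distance computation changes.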