Fast k Most Similar Neighbor Classifier for Mixed Data Based on a Tree Structure and Approximating-Eliminating

  • Authors:
  • Selene Hernández-Rodríguez; J. A. Carrasco-Ochoa; J. Fco. Martínez-Trinidad

  • Affiliations:
  • Computer Science Department, National Institute of Astrophysics, Optics and Electronics, Puebla, México, CP 72840 (all authors)

  • Venue:
  • CIARP '08: Proceedings of the 13th Iberoamerican Congress on Pattern Recognition: Progress in Pattern Recognition, Image Analysis and Applications
  • Year:
  • 2008

Abstract

The k nearest neighbor (k-NN) classifier has been extensively used as a nonparametric technique in Pattern Recognition. However, in some applications where the training set is large, the exhaustive k-NN classifier becomes impractical. Therefore, many fast k-NN classifiers have been developed to avoid this problem. Most of these classifiers rely on metric properties, usually the triangle inequality, to reduce the number of prototype comparisons. However, in soft sciences, the prototypes are usually described by both qualitative and quantitative features (mixed data), and sometimes the comparison function does not satisfy the triangle inequality. Therefore, in this work, a fast k most similar neighbor (k-MSN) classifier for mixed data, which uses a tree structure and an approximating-eliminating approach and is not based on metric properties (Tree AEMD), is introduced. The proposed classifier is compared against other fast k-NN classifiers.
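
To make the setting concrete, the sketch below shows an exhaustive k-MSN baseline over mixed data (the point of comparison the paper improves on), using a simple HEOM-style dissimilarity that mixes qualitative and quantitative features and need not satisfy the triangle inequality. This is not the paper's Tree AEMD algorithm; the dissimilarity, feature layout, and parameter names are illustrative assumptions only.

```python
from collections import Counter

def mixed_dissimilarity(x, y, is_qualitative, ranges):
    """Illustrative mixed-data comparison: overlap (0/1) for qualitative
    features, range-normalized absolute difference for quantitative ones.
    Such a function is not guaranteed to satisfy the triangle inequality."""
    total = 0.0
    for i, (a, b) in enumerate(zip(x, y)):
        if is_qualitative[i]:
            total += 0.0 if a == b else 1.0
        else:
            total += abs(a - b) / ranges[i] if ranges[i] else 0.0
    return total

def exhaustive_k_msn_classify(query, prototypes, labels, k,
                              is_qualitative, ranges):
    """Exhaustive k-MSN: compare the query against every prototype and
    vote among the k most similar (least dissimilar) ones. Fast methods
    aim to reach a comparable answer with far fewer comparisons."""
    order = sorted(
        range(len(prototypes)),
        key=lambda i: mixed_dissimilarity(query, prototypes[i],
                                          is_qualitative, ranges),
    )
    top_k_labels = [labels[i] for i in order[:k]]
    return Counter(top_k_labels).most_common(1)[0][0]

# Hypothetical example: two quantitative features and one qualitative one.
prototypes = [(1.0, 3.0, "red"), (2.0, 1.0, "blue"), (1.5, 2.5, "red")]
labels = ["A", "B", "A"]
is_qualitative = [False, False, True]
ranges = [1.0, 2.0, None]  # ranges of the quantitative features only
print(exhaustive_k_msn_classify((1.2, 2.8, "red"), prototypes, labels,
                                k=3, is_qualitative=is_qualitative,
                                ranges=ranges))
```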