The impact of feature extraction on the performance of a classifier: kNN, Naïve Bayes and C4.5

  • Authors:
  • Mykola Pechenizkiy

  • Affiliations:
  • Dept of Computer Science and Information Systems, University of Jyväskylä, Jyväskylä, Finland

  • Venue:
  • AI'05: Proceedings of the 18th Conference of the Canadian Society for Computational Studies of Intelligence on Advances in Artificial Intelligence
  • Year:
  • 2005

Abstract

“The curse of dimensionality” is pertinent to many learning algorithms; it denotes the drastic rise of computational complexity and classification error in high dimensions. In this paper, different feature extraction techniques are analyzed as means of (1) dimensionality reduction and (2) constructive induction, with respect to the performance of a classifier. Three commonly used classifiers are taken for the analysis: kNN, Naïve Bayes and the C4.5 decision tree. One of the main goals of this paper is to show the importance of using class information in feature extraction for classification, and the (in)appropriateness of random projection or conventional PCA as feature extraction for classification on some data sets. Two eigenvector-based approaches that take the class information into account are analyzed. The first approach is parametric and optimizes the ratio of between-class variance to within-class variance of the transformed data. The second approach is a nonparametric modification of the first one, based on local calculation of the between-class covariance matrix. In experiments on benchmark data sets, these two approaches are compared with each other, with conventional PCA, with random projection, and with plain classification without feature extraction for each classifier.
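
The parametric approach summarized in the abstract is the classical Fisher-style criterion: find projection directions that maximize between-class scatter relative to within-class scatter, which reduces to taking the leading eigenvectors of S_W^{-1} S_B. Below is a minimal NumPy sketch of that extraction, not the paper's own code; the function name and the pseudoinverse fallback for a singular within-class scatter matrix are illustrative assumptions.

    import numpy as np

    def parametric_feature_extraction(X, y, k):
        """Project X onto the k leading eigenvectors of S_W^{-1} S_B,
        i.e. the directions maximizing the ratio of between-class to
        within-class variance (the parametric, Fisher-style criterion)."""
        classes = np.unique(y)
        mean_total = X.mean(axis=0)
        d = X.shape[1]
        S_w = np.zeros((d, d))  # within-class scatter
        S_b = np.zeros((d, d))  # between-class scatter
        for c in classes:
            Xc = X[y == c]
            mean_c = Xc.mean(axis=0)
            S_w += (Xc - mean_c).T @ (Xc - mean_c)
            diff = (mean_c - mean_total).reshape(-1, 1)
            S_b += len(Xc) * (diff @ diff.T)
        # Solve S_W^{-1} S_B v = lambda v; the pseudoinverse guards
        # against a singular S_W (an assumption, not from the paper).
        eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(S_w) @ S_b)
        order = np.argsort(eigvals.real)[::-1]
        W = eigvecs[:, order[:k]].real
        return X @ W

Conventional PCA, by contrast, diagonalizes the total covariance of X and never consults the class labels y, which is why the abstract argues it can be inappropriate for classification on data sets where the most discriminative directions carry little overall variance.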