Kernel discriminant analysis based feature selection

  • Authors:
  • Tsuneyoshi Ishii, Masamichi Ashihara, Shigeo Abe

  • Affiliations:
  • Graduate School of Engineering, Kobe University, Rokkodai, Nada, Kobe, Japan (all authors)

  • Venue:
  • Neurocomputing
  • Year:
  • 2008


Abstract

For two-class problems we propose two feature selection criteria based on kernel discriminant analysis (KDA). The first is the objective function of kernel discriminant analysis, called the KDA criterion. We show that the KDA criterion is monotonic with respect to the deletion of features, which ensures stable feature selection. The second is the recognition rate of a KDA classifier, called the KDA-based recognition rate, which is defined in the one-dimensional space obtained by KDA: the conditional probability of a datum for each class is calculated, and the datum is classified into the class with the maximum conditional probability. To ensure stable feature selection, we evaluate the KDA-based recognition rate by cross-validation. In computer experiments on two-class problems we compare the two criteria with the cross-validated recognition rate of a support vector machine (SVM), called the SVM-based recognition rate. The selection performance of the KDA criterion and the KDA-based recognition rate is comparable, and both outperform the SVM-based recognition rate.
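The KDA criterion described above can be sketched in code. The following is a minimal illustration, not the authors' implementation: it assumes an RBF kernel, a small ridge term to keep the within-class scatter matrix invertible, and greedy backward elimination that at each step deletes the feature whose removal decreases the criterion least (the paper's monotonicity result guarantees the criterion can only decrease as features are deleted). The kernel width `gamma`, the regularizer `reg`, and the toy data are illustrative choices, not values from the paper.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    # RBF (Gaussian) kernel matrix: exp(-gamma * ||x - x'||^2)
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def kda_criterion(X, y, gamma=1.0, reg=1e-6):
    """Kernel Fisher discriminant ratio J for a two-class problem:
    J = (m0 - m1)^T N^{-1} (m0 - m1), where m_c is the vector of
    kernel means for class c and N is the within-class scatter
    matrix in the kernel-induced space (regularized for stability)."""
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    means = []
    N = np.zeros((n, n))
    for c in np.unique(y):
        idx = (y == c)
        Kc = K[:, idx]                       # columns of class c
        nc = idx.sum()
        means.append(Kc.mean(axis=1))        # (m_c)_j = mean_k k(x_j, x_k)
        centering = np.eye(nc) - np.ones((nc, nc)) / nc
        N += Kc @ centering @ Kc.T
    d = means[0] - means[1]
    N += reg * np.eye(n)                     # ridge regularization
    return float(d @ np.linalg.solve(N, d))

def backward_elimination(X, y, n_keep, gamma=1.0):
    """Greedy backward feature deletion: repeatedly drop the feature
    whose removal leaves the KDA criterion largest."""
    feats = list(range(X.shape[1]))
    while len(feats) > n_keep:
        scores = [kda_criterion(X[:, [f for f in feats if f != j]], y, gamma)
                  for j in feats]
        feats.pop(int(np.argmax(scores)))    # delete least useful feature
    return feats

# Toy check: feature 0 separates the classes, feature 1 is pure noise.
rng = np.random.default_rng(0)
x_inf = np.vstack([rng.normal(-2, 0.3, (20, 1)),
                   rng.normal(2, 0.3, (20, 1))])
x_noise = rng.normal(0, 1, (40, 1))
X = np.hstack([x_inf, x_noise])
y = np.array([0] * 20 + [1] * 20)
kept = backward_elimination(X, y, n_keep=1)
```

On this toy problem, backward elimination retains the informative feature, since deleting the noise feature barely reduces the criterion while deleting the informative one collapses it.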