A Divergence Criterion for Classifier-Independent Feature Selection

  • Authors:
  • Naoto Abe, Mineichi Kudo, Jun Toyama, Masaru Shimbo


  • Venue:
  • Proceedings of the Joint IAPR International Workshops on Advances in Pattern Recognition
  • Year:
  • 2000

Abstract

Feature selection aims to find the most important subset of a given feature set without loss of discriminative information. In general, we wish to select a feature subset that is effective for any kind of classifier; such approaches are called Classifier-Independent Feature Selection, and the method of Novovičová et al. is one of them. Their method estimates class densities with Gaussian mixture models and selects a feature subset using the Kullback-Leibler divergence between the estimated densities, but it gives no indication of how many features should be selected. Kudo and Sklansky (1997) suggested selecting a minimal feature subset such that the degree of performance degradation is bounded. In this study, following their suggestion, we try to find a feature subset that is minimal while maintaining a given Kullback-Leibler divergence.
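
The sketch below illustrates the general idea described in the abstract, not the authors' actual algorithm: class densities are fitted with Gaussian mixtures (here via scikit-learn's `GaussianMixture`), the Kullback-Leibler divergence between them is estimated by Monte Carlo sampling (there is no closed form for GMMs), and features are removed greedily while the divergence stays above a chosen fraction of its full-set value. The two-class restriction, the backward-elimination strategy, and names such as `kl_threshold_ratio` are illustrative assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture


def fit_gmm(X, n_components=2, seed=0):
    """Fit a Gaussian mixture model to one class's samples."""
    return GaussianMixture(n_components=n_components, random_state=seed).fit(X)


def kl_divergence(gmm_p, gmm_q, n_samples=5000):
    """Monte Carlo estimate of KL(p || q) = E_p[log p(x) - log q(x)]."""
    X, _ = gmm_p.sample(n_samples)
    return np.mean(gmm_p.score_samples(X) - gmm_q.score_samples(X))


def divergence_on_subset(X0, X1, features):
    """KL divergence between the two class densities on a feature subset."""
    Xs0, Xs1 = X0[:, features], X1[:, features]
    return kl_divergence(fit_gmm(Xs0), fit_gmm(Xs1))


def backward_selection(X0, X1, kl_threshold_ratio=0.9):
    """Greedily drop features while a given fraction of the full-set
    KL divergence is maintained (an assumed stopping criterion)."""
    features = list(range(X0.shape[1]))
    target = kl_threshold_ratio * divergence_on_subset(X0, X1, features)
    while len(features) > 1:
        # For each candidate removal, score the remaining subset.
        scores = [
            (divergence_on_subset(X0, X1, [f for f in features if f != g]), g)
            for g in features
        ]
        best_kl, removable = max(scores)
        if best_kl < target:
            break  # any removal would drop below the target divergence
        features.remove(removable)
    return features


if __name__ == "__main__":
    # Synthetic two-class example: only feature 0 carries class information.
    rng = np.random.default_rng(0)
    X0 = rng.normal(0.0, 1.0, size=(500, 5))
    X1 = rng.normal(0.0, 1.0, size=(500, 5))
    X1[:, 0] += 3.0
    print(backward_selection(X0, X1))  # typically retains feature 0
```

Backward elimination is used here only because it makes the "minimal subset maintaining a given divergence" criterion easy to demonstrate; the paper itself should be consulted for the actual search procedure.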