Entropy criterion for classifier-independent feature selection

  • Authors:
  • Naoto Abe; Mineichi Kudo

  • Affiliations:
  • Division of Computer Science, Graduate School of Information Science and Technology, Hokkaido University, Sapporo, Japan (both authors)

  • Venue:
  • KES'05: Proceedings of the 9th International Conference on Knowledge-Based Intelligent Information and Engineering Systems - Volume Part IV
  • Year:
  • 2005


Abstract

Feature selection aims to select, from the original feature set, a feature subset that retains the discriminative information. In practice, we do not know in advance which classifier will be used, so it is preferable to find a feature subset that is universally effective for any classifier. Such an approach is called classifier-independent feature selection, and it can be achieved by removing garbage features, that is, features carrying no discriminative information. However, it is difficult to distinguish the garbage features from the others. In this study, we propose an entropy criterion for this goal and confirm its effectiveness on a synthetic dataset.
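
Illustration (not from the paper): the abstract does not reproduce the proposed criterion itself, so the following Python sketch only conveys the general idea behind entropy-based, classifier-independent feature scoring: a garbage feature leaves the entropy of the class label essentially unchanged, so its score stays near zero and it can be discarded regardless of the classifier used later. The binning scheme, the threshold value, and all function names below are illustrative assumptions, not the authors' method.

    # Hypothetical sketch of entropy-based feature scoring for
    # classifier-independent feature selection. Not the paper's criterion;
    # it only illustrates scoring each feature by how much it reduces the
    # entropy of the class label, so that near-zero-score ("garbage")
    # features can be removed.
    import numpy as np

    def class_entropy(y):
        """Shannon entropy (in bits) of the class-label distribution."""
        _, counts = np.unique(y, return_counts=True)
        p = counts / counts.sum()
        return float(-(p * np.log2(p)).sum())

    def feature_score(x, y, n_bins=10):
        """H(Y) - H(Y | binned x): information the feature carries about the class."""
        edges = np.histogram_bin_edges(x, bins=n_bins)
        bins = np.digitize(x, edges)
        h_y_given_x = 0.0
        for b in np.unique(bins):
            mask = bins == b
            h_y_given_x += mask.mean() * class_entropy(y[mask])
        return class_entropy(y) - h_y_given_x  # ~0 for a garbage feature

    def select_features(X, y, threshold=0.05):
        """Keep the columns of X whose score exceeds a small threshold."""
        scores = np.array([feature_score(X[:, j], y) for j in range(X.shape[1])])
        return np.where(scores > threshold)[0], scores

Here X is an (n_samples, n_features) array and y the class labels; the returned indices are the features kept under this illustrative score, while the remaining ones would be treated as garbage.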