Divergence-based feature selection for separate classes

  • Authors:
  • Yishi Zhang, Shujuan Li, Teng Wang, Zigang Zhang

  • Affiliations:
  • Yishi Zhang, Shujuan Li, Zigang Zhang: School of Management, Huazhong University of Science and Technology, 1037 Luoyu Road, Hongshan District, Wuhan 430074, China
  • Teng Wang: School of Software Engineering, Huazhong University of Science and Technology, Wuhan 430074, China

  • Venue:
  • Neurocomputing
  • Year:
  • 2013

Abstract

Feature selection is one of the core issues in designing pattern recognition systems and has attracted considerable attention in the literature. Most existing feature selection methods handle relevance and redundancy analysis only from the point of view of the whole set of classes, neglecting the relation between features and individual class labels. In this paper, we propose a novel feature selection framework that explicitly handles relevance and redundancy analysis for each class label, and we derive two simple and effective feature selection algorithms from this framework using the Kullback-Leibler divergence. An empirical study evaluates the efficiency and effectiveness of our algorithms against five representative feature selection algorithms. The results show that the proposed algorithms are efficient and outperform the compared algorithms in most cases, demonstrating the superiority of the proposed framework.
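
The abstract does not spell out the scoring rule, but the idea of class-specific, divergence-based relevance can be illustrated with a minimal sketch. The Python snippet below scores each feature separately for each class label by the Kullback-Leibler divergence between the feature's class-conditional distribution and its marginal distribution; the function names and this particular choice of distributions are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL divergence D(p || q) between two discrete distributions."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def per_class_relevance(X, y):
    """Illustrative per-class relevance scores (assumed scheme, not the
    paper's exact method): for each (class, feature) pair, compare the
    feature's distribution within that class to its overall distribution.
    X holds discrete feature values, one row per sample."""
    classes = np.unique(y)
    n_features = X.shape[1]
    scores = np.zeros((len(classes), n_features))
    for j in range(n_features):
        values, marginal_counts = np.unique(X[:, j], return_counts=True)
        marginal = marginal_counts / marginal_counts.sum()
        for i, c in enumerate(classes):
            in_class = X[y == c, j]
            cond_counts = np.array([(in_class == v).sum() for v in values])
            conditional = cond_counts / max(cond_counts.sum(), 1)
            scores[i, j] = kl_divergence(conditional, marginal)
    return classes, scores

if __name__ == "__main__":
    # Toy data: 5 discrete features, 2 class labels; rank features per class.
    rng = np.random.default_rng(0)
    X = rng.integers(0, 3, size=(200, 5))
    y = rng.integers(0, 2, size=200)
    classes, scores = per_class_relevance(X, y)
    for c, row in zip(classes, scores):
        print(f"class {c}: feature ranking {np.argsort(row)[::-1]}")
```

In such a scheme each class gets its own feature ranking, which is the "separate classes" aspect the paper emphasizes; redundancy analysis among the selected features would be handled on top of these per-class scores.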