A novel feature selection method based on normalized mutual information

  • Authors:
  • La The Vinh, Sungyoung Lee, Young-Tack Park, Brian J. D'Auriol

  • Affiliations:
  • Dept. of Computer Engineering, Kyung Hee University, Seoul, Korea (La The Vinh, Sungyoung Lee, Brian J. D'Auriol); School of IT, Soongsil University, Seoul, Korea (Young-Tack Park)

  • Venue:
  • Applied Intelligence
  • Year:
  • 2012

Abstract

In this paper, a novel feature selection method based on the normalization of the well-known mutual information measure is presented. Our method builds on an existing approach, the max-relevance and min-redundancy (mRMR) approach. We propose, however, to normalize the mutual information used in the method so that neither the relevance term nor the redundancy term can dominate the selection criterion. We employ several commonly used classifiers, including the Support Vector Machine (SVM), k-Nearest Neighbor (kNN), and Linear Discriminant Analysis (LDA), to compare our algorithm with the original mRMR and with a recently improved version of mRMR, the Normalized Mutual Information Feature Selection (NMIFS) algorithm. To avoid dataset-specific conclusions, we conduct our classification experiments on various datasets from the UCI Machine Learning Repository. The results confirm that our feature selection method is more robust than the others with respect to classification accuracy.
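
For readers who want a concrete picture of how such a criterion can be used, the sketch below implements a generic mRMR-style greedy forward selection driven by normalized mutual information in Python. It is only an illustration under stated assumptions: features and class labels are assumed discrete (or already discretized), the normalization by the smaller marginal entropy is one common choice rather than the normalization defined in this paper, and the helper names (normalized_mi, select_features) are hypothetical.

```python
import numpy as np

def entropy(values):
    """Shannon entropy (in nats) of a discrete variable."""
    _, counts = np.unique(values, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)))

def mutual_information(x, y):
    """I(X; Y) = H(X) + H(Y) - H(X, Y) for discrete variables."""
    joint = np.stack([x, y], axis=1)
    _, counts = np.unique(joint, axis=0, return_counts=True)
    p = counts / counts.sum()
    h_xy = float(-np.sum(p * np.log(p)))
    return entropy(x) + entropy(y) - h_xy

def normalized_mi(x, y):
    """MI rescaled into [0, 1]; dividing by the smaller marginal entropy
    is one common choice (an assumption here, not the paper's definition)."""
    denom = min(entropy(x), entropy(y))
    return mutual_information(x, y) / denom if denom > 0 else 0.0

def select_features(X, y, k):
    """Greedy mRMR-style forward selection: at each step pick the feature
    maximizing (normalized relevance - mean normalized redundancy)."""
    n_features = X.shape[1]
    relevance = np.array([normalized_mi(X[:, j], y) for j in range(n_features)])
    selected = [int(np.argmax(relevance))]       # start with the most relevant feature
    remaining = [j for j in range(n_features) if j != selected[0]]
    while len(selected) < k and remaining:
        scores = [
            relevance[j]
            - np.mean([normalized_mi(X[:, j], X[:, s]) for s in selected])
            for j in remaining
        ]
        best = remaining[int(np.argmax(scores))]
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy usage: 5 discretized features, select the top 3
rng = np.random.default_rng(0)
X = rng.integers(0, 4, size=(200, 5))
y = (X[:, 0] + X[:, 2]) % 2                      # class depends on features 0 and 2
print(select_features(X, y, k=3))
```

Because both the relevance and redundancy terms are scaled into [0, 1] before they are combined, neither quantity can dominate the score simply by having a larger dynamic range, which is the intuition the abstract describes.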