Feature subset selection using differential evolution

  • Authors:
  • Rami N. Khushaba; Ahmed Al-Ani; Adel Al-Jumaily

  • Affiliations:
  • Faculty of Engineering and Information Technology, University of Technology, Sydney, Australia (all authors)

  • Venue:
  • ICONIP'08: Proceedings of the 15th International Conference on Advances in Neuro-Information Processing, Part I
  • Year:
  • 2008

Abstract

One of the fundamental motivations for feature selection is to overcome the curse of dimensionality. This paper develops a novel feature selection algorithm that combines the Differential Evolution (DE) optimization technique with statistical feature distribution measures. The new algorithm, referred to as DEFS, adapts the real-valued (float) DE optimizer to the combinatorial problem of feature selection. DEFS substantially reduces the computational cost while delivering strong performance. It is evaluated as a search procedure on several datasets of varying dimensionality, and the experimental results demonstrate its advantages in terms of solution optimality and memory requirements.
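
To make the idea concrete, below is a minimal sketch (not the authors' DEFS code) of how a real-valued DE optimizer can be mapped onto feature subset selection: each population member is a float vector whose rounded entries index the selected features, duplicate indices are repaired, and a simple Fisher-ratio filter score stands in for the statistical distribution measures described in the paper. The function names, the fitness criterion, and the parameter values (pop_size, gens, F, CR) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def fisher_score(X, y, idx):
    """Filter-style fitness: mean Fisher ratio over the selected feature indices."""
    classes = np.unique(y)
    scores = []
    for j in idx:
        col = X[:, j]
        class_means = np.array([col[y == c].mean() for c in classes])
        class_vars = np.array([col[y == c].var() + 1e-12 for c in classes])
        scores.append(((class_means - col.mean()) ** 2).sum() / class_vars.sum())
    return float(np.mean(scores))

def decode(vec, n_features):
    """Round a float vector to feature indices and repair duplicates randomly."""
    idx = np.clip(np.round(vec).astype(int), 0, n_features - 1)
    seen, repaired = set(), []
    for i in idx:
        while i in seen:                      # duplicate -> draw an unused index
            i = int(rng.integers(n_features))
        seen.add(i)
        repaired.append(i)
    return np.array(repaired)

def de_feature_selection(X, y, n_select, pop_size=30, gens=100, F=0.5, CR=0.9):
    n_features = X.shape[1]
    pop = rng.uniform(0, n_features - 1, size=(pop_size, n_select))
    fit = np.array([fisher_score(X, y, decode(p, n_features)) for p in pop])
    for _ in range(gens):
        for i in range(pop_size):
            # DE/rand/1 mutation and binomial crossover on the float genotype
            others = [k for k in range(pop_size) if k != i]
            a, b, c = pop[rng.choice(others, size=3, replace=False)]
            mutant = np.clip(a + F * (b - c), 0, n_features - 1)
            cross = rng.random(n_select) < CR
            cross[rng.integers(n_select)] = True   # ensure at least one mutant gene
            trial = np.where(cross, mutant, pop[i])
            f_trial = fisher_score(X, y, decode(trial, n_features))
            if f_trial > fit[i]:               # greedy replacement (maximising fitness)
                pop[i], fit[i] = trial, f_trial
    return decode(pop[np.argmax(fit)], n_features)

# Toy usage: 200 samples, 50 features, select 5; features 3, 7, 11 carry class info.
X = rng.normal(size=(200, 50))
y = rng.integers(0, 2, size=200)
X[:, [3, 7, 11]] += y[:, None] * 2.0
print(de_feature_selection(X, y, n_select=5))
```

Rounding-plus-repair is only one way to translate the continuous DE genotype into an index subset; the paper's actual mapping and its distribution-based measures differ in detail.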