Feature Selection for MLP Neural Network: The Use of Random Permutation of Probabilistic Outputs

  • Authors:
  • Jian-Bo Yang; Kai-Quan Shen; Chong-Jin Ong; Xiao-Ping Li

  • Affiliations:
  • Dept. of Mech. Eng., Nat. Univ. of Singapore, Singapore, Singapore

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2009

Abstract

This paper presents a new wrapper-based feature selection method for multilayer perceptron (MLP) neural networks. It uses a feature ranking criterion that measures the importance of a feature by computing the aggregate difference, over the feature space, between the probabilistic outputs of the MLP with and without that feature. The criterion therefore yields an importance score for every feature. Numerical experiments on several artificial and real-world data sets show that the proposed method generally outperforms several existing feature selection methods for MLP, particularly when the data set is sparse or contains many redundant features. Moreover, for a wrapper-based approach, its computational cost is modest.
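
The sketch below is only an illustration of the general idea described in the abstract: score a feature by how much the MLP's probabilistic outputs change when that feature's contribution is removed, here approximated by randomly permuting its values on a held-out set. The data set, network size, permutation-based "removal", and mean-absolute-difference aggregation are assumptions made for this example and are not taken from the paper itself.

```python
# Hedged sketch: permutation-style feature importance from MLP probabilistic outputs.
# All modeling choices below (synthetic data, network size, aggregation) are illustrative.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic data with informative and redundant features (stand-in for a real data set).
X, y = make_classification(n_samples=500, n_features=10, n_informative=4,
                           n_redundant=3, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)

# Train an MLP once; the wrapper criterion then reuses its probabilistic outputs.
mlp = MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000, random_state=0)
mlp.fit(X_train, y_train)

base_proba = mlp.predict_proba(X_val)  # probabilistic outputs with all features intact

scores = []
for j in range(X_val.shape[1]):
    X_perm = X_val.copy()
    # Randomly permute feature j to break its association with the target,
    # approximating "the MLP without the feature".
    X_perm[:, j] = rng.permutation(X_perm[:, j])
    perm_proba = mlp.predict_proba(X_perm)
    # Aggregate the change in probabilistic outputs over the validation samples.
    scores.append(np.mean(np.abs(base_proba - perm_proba)))

ranking = np.argsort(scores)[::-1]
print("Feature ranking (most to least important):", ranking)
```

Because the network is trained only once and each feature's score needs just one extra forward pass over the permuted data, this kind of criterion keeps the cost of a wrapper approach relatively low, which is consistent with the abstract's claim of modest computational cost.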