Fast generic selection of features for neural network classifiers

  • Authors:
  • F. Z. Brill; D. E. Brown; W. N. Martin

  • Affiliations:
  • Inst. for Parallel Comput., Virginia Univ., Charlottesville, VA

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 1992

Abstract

The authors describe experiments using a genetic algorithm for feature selection in the context of neural network classifiers, specifically counterpropagation networks. They present the novel techniques used in their application of genetic algorithms. First, the genetic algorithm is configured to use an approximate evaluation in order to significantly reduce the computation required. In particular, though the desired classifiers are counterpropagation networks, they use a nearest-neighbor classifier to evaluate feature sets and show that the features selected by this method are effective in the context of counterpropagation networks. Second, a method called training set sampling, in which only a portion of the training set is used in any given evaluation, is proposed. This method yields substantial computational savings: evaluations run over an order of magnitude faster. It selects feature sets that are as good as, and occasionally better for counterpropagation than, those chosen by an evaluation that uses the entire training set.
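The two speedups described above can be sketched together: a genetic search over binary feature masks whose fitness is an approximate 1-nearest-neighbor accuracy, computed on a randomly sampled portion of the training set rather than the whole set. This is an illustrative reconstruction, not the paper's actual implementation; all function names, parameters, and GA settings (population size, mutation rate, sample fraction) are assumptions.

```python
import random

def nn_accuracy(mask, data, labels):
    """Leave-one-out 1-NN accuracy using only the features where mask is 1.

    This stands in for the cheap approximate evaluation: no network
    is trained, yet the score ranks feature sets usefully.
    """
    feats = [i for i, bit in enumerate(mask) if bit]
    if not feats:
        return 0.0
    correct = 0
    for i, x in enumerate(data):
        best_d, best_j = float("inf"), -1
        for j, y in enumerate(data):
            if i == j:
                continue
            d = sum((x[f] - y[f]) ** 2 for f in feats)  # squared Euclidean
            if d < best_d:
                best_d, best_j = d, j
        if labels[best_j] == labels[i]:
            correct += 1
    return correct / len(data)

def ga_select(data, labels, n_features, pop_size=20, gens=30,
              sample_frac=0.3, seed=0):
    """Genetic search over feature masks with training-set sampling:
    each generation's fitness calls see only a random fraction of the
    training data, which is where the order-of-magnitude speedup comes from.
    """
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)]
           for _ in range(pop_size)]
    for _ in range(gens):
        # Training-set sampling: draw a fresh subset for this generation.
        k = max(2, int(sample_frac * len(data)))
        idx = rng.sample(range(len(data)), k)
        sub = [data[i] for i in idx]
        sub_y = [labels[i] for i in idx]
        # Rank by approximate fitness; keep the top half as parents.
        scored = sorted(pop, key=lambda m: nn_accuracy(m, sub, sub_y),
                        reverse=True)
        parents = scored[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_features)   # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.1:               # bit-flip mutation
                flip = rng.randrange(n_features)
                child[flip] ^= 1
            children.append(child)
        pop = parents + children
    # Final ranking uses the full training set once.
    return max(pop, key=lambda m: nn_accuracy(m, data, labels))
```

The selected mask would then be used to train the actual counterpropagation classifier on the reduced feature set; the paper's claim is that masks chosen this cheaply transfer well to that final network.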