The authors describe experiments using a genetic algorithm for feature selection in the context of neural network classifiers, specifically counterpropagation networks. They present two novel techniques in the application of genetic algorithms. First, the genetic algorithm is configured to use an approximate evaluation in order to significantly reduce the computation required: although the desired classifiers are counterpropagation networks, a nearest-neighbor classifier is used to evaluate feature sets, and the features selected by this method are shown to be effective for counterpropagation networks. Second, a method called training set sampling, in which only a portion of the training set is used in any given evaluation, is proposed. This method yields substantial computational savings, making evaluations over an order of magnitude faster, while selecting feature sets that are as good as, and occasionally better for counterpropagation than, those chosen by an evaluation using the entire training set.
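The two speed-up ideas can be illustrated with a minimal sketch: a genetic algorithm over binary feature masks whose fitness is leave-one-out 1-nearest-neighbor accuracy computed on a random subsample of the training set. This is not the authors' code; the GA details (truncation selection, one-point crossover, mutation rate) and all function names are assumptions made for illustration.

```python
# Sketch (not the authors' implementation): GA feature selection with an
# approximate fitness -- leave-one-out 1-NN accuracy on a random sample of
# the training set -- instead of training the target network per evaluation.
import random

def nn_accuracy(data, labels, mask):
    """Leave-one-out 1-NN accuracy using only features where mask[j] == 1."""
    feats = [j for j, m in enumerate(mask) if m]
    if not feats:
        return 0.0
    correct = 0
    for i, x in enumerate(data):
        best, best_d = None, float("inf")
        for k, y in enumerate(data):
            if k == i:
                continue
            d = sum((x[j] - y[j]) ** 2 for j in feats)  # squared Euclidean
            if d < best_d:
                best_d, best = d, k
        if labels[best] == labels[i]:
            correct += 1
    return correct / len(data)

def ga_select(data, labels, n_features, pop=20, gens=15, sample=30, seed=0):
    """GA over binary feature masks; each fitness call uses a fresh random
    subsample of the training set (the 'training set sampling' idea)."""
    rng = random.Random(seed)
    popn = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(pop)]

    def fitness(mask):
        idx = rng.sample(range(len(data)), min(sample, len(data)))
        return nn_accuracy([data[i] for i in idx], [labels[i] for i in idx], mask)

    for _ in range(gens):
        scored = sorted(popn, key=fitness, reverse=True)
        popn = scored[: pop // 2]                 # truncation selection
        while len(popn) < pop:
            a, b = rng.sample(scored[: pop // 2], 2)
            cut = rng.randrange(1, n_features)
            child = a[:cut] + b[cut:]             # one-point crossover
            if rng.random() < 0.1:                # bit-flip mutation
                j = rng.randrange(n_features)
                child[j] ^= 1
            popn.append(child)
    return max(popn, key=fitness)
```

Because each fitness call touches only `sample` training points instead of the whole set, each evaluation costs O(sample^2) distance computations rather than O(n^2), which is where the order-of-magnitude speed-up the abstract reports would come from.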