An experimental study of two decision issues in wrapper feature selection (FS) with multilayer perceptrons and the sequential backward selection (SBS) procedure is presented. The issues studied are the stopping criterion and whether the network should be retrained before computing each feature's saliency. Experimental results indicate that the extra computational cost of retraining the network with every feature temporarily removed, prior to computing its saliency, is rewarded with a significant performance improvement. Although quite intuitive, this idea has rarely been used in practice. Regarding the stopping criterion, a somewhat counterintuitive conclusion can be drawn: forcing overtraining may be as useful as early stopping. A significant improvement in the overall results with respect to learning with the whole set of variables is also observed.
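The retraining-before-saliency idea can be sketched as follows. This is a minimal illustration, not the authors' exact procedure: it assumes scikit-learn's `MLPClassifier` as the perceptron, uses validation accuracy after full retraining as the saliency measure, and all names (`sbs_wrapper`, `score_subset`, `min_features`) are illustrative.

```python
# Sketch of wrapper sequential backward selection (SBS) with retraining:
# at each step every candidate feature is temporarily removed, the network
# is retrained from scratch on the reduced feature set, and the validation
# score of the retrained model serves as that feature's saliency.
# Illustrative code only; names and hyperparameters are assumptions.
from sklearn.neural_network import MLPClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

def score_subset(features, X_tr, y_tr, X_va, y_va):
    """Retrain an MLP on the given feature subset; return validation accuracy."""
    clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=300, random_state=0)
    clf.fit(X_tr[:, features], y_tr)
    return clf.score(X_va[:, features], y_va)

def sbs_wrapper(X_tr, y_tr, X_va, y_va, min_features=2):
    """Sequential backward selection with full retraining per candidate."""
    remaining = list(range(X_tr.shape[1]))
    best = (score_subset(remaining, X_tr, y_tr, X_va, y_va), list(remaining))
    while len(remaining) > min_features:
        # Saliency of feature f = score achieved after retraining without f;
        # the feature whose removal hurts least is dropped first.
        trials = [(score_subset([g for g in remaining if g != f],
                                X_tr, y_tr, X_va, y_va), f)
                  for f in remaining]
        score, worst = max(trials)
        remaining.remove(worst)
        if score >= best[0]:          # simple bookkeeping of the best subset
            best = (score, list(remaining))
    return best

X, y = make_classification(n_samples=200, n_features=8, n_informative=3,
                           n_redundant=2, random_state=0)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, random_state=0)
acc, subset = sbs_wrapper(X_tr, y_tr, X_va, y_va)
```

The stopping criterion studied in the paper would replace the simple bookkeeping above; the sketch only records the best-scoring subset seen so far.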