Selecting the optimal number of features in a classifier ensemble normally requires a validation set or cross-validation. In this paper, feature ranking is combined with Recursive Feature Elimination (RFE), an effective technique for eliminating irrelevant features when the feature dimension is large. Stopping criteria are based on the out-of-bootstrap (OOB) estimate and on class separability, both computed on the training set, thereby obviating the need for a validation set. Multi-class problems are handled using the Error-Correcting Output Coding (ECOC) method. Experiments on natural benchmark data demonstrate the effectiveness of these stopping criteria.
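The core idea of RFE with an OOB-based stopping criterion can be sketched as follows. This is a minimal illustration, not the paper's implementation: the nearest-centroid classifier and the correlation-based feature ranking are placeholder choices, and the OOB estimate here simply averages error on points left out of each bootstrap sample.

```python
import numpy as np

def fit_centroid(X, y):
    """Placeholder base learner: nearest class centroid."""
    classes = np.unique(y)
    return classes, np.array([X[y == c].mean(axis=0) for c in classes])

def predict_centroid(model, X):
    classes, cents = model
    d = ((X[:, None, :] - cents[None, :, :]) ** 2).sum(axis=2)
    return classes[np.argmin(d, axis=1)]

def oob_error(X, y, n_boot=20, seed=0):
    """Out-of-bootstrap (OOB) error: train on bootstrap samples,
    evaluate on the left-out points, average over rounds."""
    rng = np.random.default_rng(seed)
    n = len(y)
    errs = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)            # bootstrap sample (with replacement)
        oob = np.setdiff1d(np.arange(n), idx)  # points not drawn into the sample
        if oob.size == 0:
            continue
        model = fit_centroid(X[idx], y[idx])
        errs.append(np.mean(predict_centroid(model, X[oob]) != y[oob]))
    return float(np.mean(errs))

def rfe_oob(X, y, min_features=1):
    """Recursively eliminate the lowest-ranked feature; keep the
    subset whose OOB error estimate (on training data only) is lowest."""
    feats = list(range(X.shape[1]))
    best_err, best_feats = np.inf, feats[:]
    while True:
        err = oob_error(X[:, feats], y)
        if err < best_err:
            best_err, best_feats = err, feats[:]
        if len(feats) <= min_features:
            break
        # Rank remaining features by a simple relevance score
        # (|correlation with label|) and drop the weakest one.
        scores = [abs(np.corrcoef(X[:, f], y)[0, 1]) for f in feats]
        feats.pop(int(np.argmin(scores)))
    return best_err, best_feats
```

No validation set is used anywhere: both the ranking and the stopping decision are computed from the training data alone, which is the property the paper's criteria are designed to provide.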