A Wrapper Method for Feature Selection in Multiple Classes Datasets
IWANN '09 Proceedings of the 10th International Work-Conference on Artificial Neural Networks: Part I: Bio-Inspired Systems: Computational and Ambient Intelligence
In this paper, an incremental version of the ANOVA and Functional Networks Feature Selection (AFN-FS) method is presented. This new wrapper method, IAFN-FS, is based on an incremental functional decomposition, thus eliminating the main drawback of the basic method: the exponential complexity of the functional decomposition, which restricted its applicability to datasets with a relatively small number of features. The performance of the incremental version was evaluated on several real data sets. The results show that IAFN-FS achieves higher accuracy than other standard and novel feature selection methods while using a small set of features.
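The core idea of an incremental wrapper can be illustrated with a minimal sketch. This is not the authors' IAFN-FS algorithm: the ANOVA/functional-network decomposition is replaced here by a plain hold-out accuracy score, and the learner is a simple nearest-centroid classifier, purely to show how features are added one at a time instead of evaluating all subsets at once.

```python
# Hedged sketch of a generic wrapper-style incremental (forward) feature
# selection. NOT the paper's IAFN-FS method: the scoring criterion here is
# hold-out accuracy of a nearest-centroid classifier, chosen only because it
# needs no external libraries.

def nearest_centroid_accuracy(train, test, features):
    """Accuracy of a nearest-centroid rule restricted to `features`."""
    centroids, counts = {}, {}
    # Accumulate per-class sums over the selected features.
    for x, y in train:
        c = centroids.setdefault(y, [0.0] * len(features))
        counts[y] = counts.get(y, 0) + 1
        for i, f in enumerate(features):
            c[i] += x[f]
    # Turn sums into means (class centroids).
    for y in centroids:
        centroids[y] = [v / counts[y] for v in centroids[y]]
    correct = 0
    for x, y in test:
        # Predict the class whose centroid is closest in the selected subspace.
        pred = min(centroids, key=lambda lbl: sum(
            (x[f] - centroids[lbl][i]) ** 2 for i, f in enumerate(features)))
        correct += pred == y
    return correct / len(test)

def forward_select(train, test, n_features):
    """Greedily add the single feature that most improves wrapper accuracy;
    stop as soon as no candidate improves on the current subset."""
    selected, best_acc = [], 0.0
    while len(selected) < n_features:
        scored = [(nearest_centroid_accuracy(train, test, selected + [f]), f)
                  for f in range(n_features) if f not in selected]
        acc, f = max(scored)
        if acc <= best_acc:
            break  # incremental growth stalls: keep the small subset found
        selected.append(f)
        best_acc = acc
    return selected, best_acc

# Toy illustration: feature 0 separates the classes, feature 1 is noise.
train = [([0.0, 5.0], 'a'), ([0.1, -3.0], 'a'),
         ([1.0, 4.0], 'b'), ([0.9, -4.0], 'b')]
test = [([0.05, 0.0], 'a'), ([0.95, 1.0], 'b')]

sel, acc = forward_select(train, test, n_features=2)
print(sel, acc)  # feature 0 alone suffices on this toy data
```

Because each step evaluates only the remaining single-feature extensions of the current subset, the search is polynomial in the number of features, which mirrors how the incremental decomposition avoids the exponential cost of evaluating the full functional decomposition.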