Feature selection algorithms should remove irrelevant and redundant features while maintaining or even improving performance, thereby helping learning models generalize better. Feature selection methods are commonly grouped into filters and wrappers. Most existing models handle binary problems more or less adequately, but often underperform on multiclass tasks. This article describes a new wrapper method, IAFN-FS (Incremental ANOVA and Functional Networks-Feature Selection), in a version designed for multiclass problems. Two alternatives were tried to handle the multiclass setting: (a) treating the multiclass problem directly, and (b) dividing the original multiclass problem into several binary problems. To evaluate both approaches, a comparative study was carried out on several benchmark datasets, comparing our two methods against wrappers based on classical algorithms such as C4.5 and Naive Bayes.
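To make alternative (b) concrete, the sketch below decomposes a multiclass problem into one-vs-rest binary problems and runs a wrapper feature selector on each. It is only an illustration of the decomposition idea, not the IAFN-FS method itself: greedy forward selection with a Gaussian Naive Bayes evaluator stands in for the ANOVA/functional-network criterion, and scikit-learn and NumPy are assumed to be available.

```python
# Illustrative sketch (not the authors' IAFN-FS): one-vs-rest decomposition
# of a multiclass problem, with a simple wrapper feature selector per class.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score


def forward_wrapper(X, y, max_features=5, cv=5):
    """Greedy forward selection driven by cross-validated accuracy."""
    remaining = list(range(X.shape[1]))
    selected, best_score = [], 0.0
    while remaining and len(selected) < max_features:
        candidates = []
        for f in remaining:
            feats = selected + [f]
            score = cross_val_score(GaussianNB(), X[:, feats], y, cv=cv).mean()
            candidates.append((score, f))
        score, f = max(candidates)
        if score <= best_score:  # stop when no remaining feature improves accuracy
            break
        best_score = score
        selected.append(f)
        remaining.remove(f)
    return selected


def one_vs_rest_selection(X, y):
    """Run the wrapper once per class on a binarized problem; return the union."""
    subsets = {}
    for c in np.unique(y):
        y_bin = (y == c).astype(int)  # class c vs. the rest
        subsets[c] = forward_wrapper(X, y_bin)
    return sorted(set().union(*subsets.values())), subsets
```

Merging the per-class subsets by union is just one possible combination rule; intersection or voting schemes are equally plausible, and the paper's comparative study is what determines which decomposition strategy actually pays off against the direct multiclass treatment.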