Feature selection eliminates irrelevant and redundant features, improving prediction accuracy and reducing the computational overhead of classification. This paper presents a comparison of three methods: Fast Correlation-Based Feature Selection (FCBF), multi-threaded FCBF, and Decision-Dependent/Decision-Independent Correlation (DDC-DIC). These approaches assess the relevance of individual features and use pairwise feature correlation for redundancy checking, in order to improve prediction accuracy and reduce computation time. The experiments were run in the Weka tool with the C4.5 decision tree construction algorithm, which yielded better performance on the lung cancer, TIC 2000 insurance company, and breast cancer data sets.

Keywords: Feature Selection, Correlation, Relevance, Redundancy.
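The relevance-then-redundancy idea behind FCBF can be sketched in a few lines. The sketch below is a simplified, illustrative version (not the authors' implementation): it scores each feature's relevance to the class with symmetric uncertainty, then drops a feature if it is more correlated with an already-selected feature than with the class. The function names and the `delta` threshold are assumptions chosen for this example.

```python
import numpy as np
from collections import Counter

def entropy(x):
    """Shannon entropy (bits) of a discrete sequence."""
    counts = np.array(list(Counter(x).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def symmetric_uncertainty(x, y):
    """SU(X, Y) = 2 * IG(X; Y) / (H(X) + H(Y)), normalized to [0, 1]."""
    hx, hy = entropy(x), entropy(y)
    hxy = entropy(list(zip(x, y)))   # joint entropy H(X, Y)
    ig = hx + hy - hxy               # mutual information / information gain
    denom = hx + hy
    return 2.0 * ig / denom if denom > 0 else 0.0

def fcbf(features, labels, delta=0.0):
    """Simplified FCBF sketch: keep features with SU(feature, class) > delta,
    ranked by relevance, and discard a feature as redundant when some
    already-selected feature correlates with it at least as strongly as
    the class does. Returns indices of the selected features."""
    su_class = [symmetric_uncertainty(f, labels) for f in features]
    order = sorted((i for i in range(len(features)) if su_class[i] > delta),
                   key=lambda i: -su_class[i])
    selected = []
    for i in order:
        redundant = any(
            symmetric_uncertainty(features[i], features[j]) >= su_class[i]
            for j in selected)
        if not redundant:
            selected.append(i)
    return selected
```

For example, with a feature identical to the class, an exact duplicate of it, and an independent feature, the sketch keeps only the first: `fcbf([[0,0,1,1], [0,0,1,1], [0,1,0,1]], [0,0,1,1])` returns `[0]` — the duplicate is flagged as redundant and the independent feature as irrelevant.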