In feature selection, over-fitting can seriously degrade generalization ability. We introduce the concept of combining multiple feature selection criteria within a feature selection method, with the aim of obtaining feature subsets that generalize better. The concept is applicable to many existing feature selection methods; here we discuss the family of sequential search methods in more detail. The concept does not prescribe which criteria to combine. To illustrate its feasibility, we give a simple example that combines the estimated accuracy of k-nearest-neighbor classifiers for various values of k. Experiments on a number of datasets show the potential for improvement, both in classifier performance on independent test data and in feature selection stability.
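The combination described above can be sketched in code. The following is a minimal, hypothetical illustration (not the authors' implementation): sequential forward selection in which each candidate feature subset is scored by averaging the leave-one-out accuracy of k-NN classifiers over several values of k. The dataset, the choice of k values, and the leave-one-out estimator are illustrative assumptions.

```python
# Hypothetical sketch: sequential forward selection with a combined
# criterion = mean leave-one-out k-NN accuracy over several k values.

def knn_predict(train_X, train_y, x, feats, k):
    """Predict the label of x by majority vote among its k nearest
    training points, using only the features listed in feats."""
    dists = sorted(
        (sum((row[f] - x[f]) ** 2 for f in feats), yi)
        for row, yi in zip(train_X, train_y)
    )
    votes = [label for _, label in dists[:k]]
    return max(set(votes), key=votes.count)

def loo_accuracy(X, y, feats, k):
    """Leave-one-out accuracy of a k-NN classifier on feature subset feats."""
    correct = 0
    for i in range(len(X)):
        train_X = X[:i] + X[i + 1:]
        train_y = y[:i] + y[i + 1:]
        correct += knn_predict(train_X, train_y, X[i], feats, k) == y[i]
    return correct / len(X)

def combined_score(X, y, feats, ks=(1, 3, 5)):
    """Combined criterion: average the k-NN accuracy estimate over ks
    (illustrative choice of k values)."""
    return sum(loo_accuracy(X, y, feats, k) for k in ks) / len(ks)

def sequential_forward_selection(X, y, n_features, target_size):
    """Greedily add the feature that most improves the combined score."""
    selected, remaining = [], list(range(n_features))
    while remaining and len(selected) < target_size:
        best = max(remaining,
                   key=lambda f: combined_score(X, y, selected + [f]))
        selected.append(best)
        remaining.remove(best)
    return selected
```

On a toy dataset where feature 0 separates the classes and feature 1 is noise, the combined criterion steers the search toward feature 0; with a single criterion (one fixed k), an unlucky estimate on small data can more easily favor the noisy feature.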