Feature selection aims to choose a feature subset that retains the discriminative information of the original feature set. In practice, the classifier to be used is often not known beforehand, so it is preferable to find a feature subset that is effective regardless of the classifier. Such an approach is called classifier-independent feature selection, and it can be realized by removing garbage features, that is, features carrying no discriminative information. However, distinguishing garbage features from the rest is difficult. In this study, we propose an entropy-based criterion for this purpose and confirm its effectiveness on a synthetic dataset.
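As an illustration of the idea (not the paper's actual criterion), a garbage feature can be flagged by an information-theoretic score: a feature that is statistically independent of the class label has near-zero mutual information with it, while a discriminative feature does not. The sketch below, assuming a simple histogram plug-in estimator and a hypothetical two-class synthetic dataset, contrasts one informative feature with one garbage feature.

```python
import numpy as np

def mutual_information(x, y, bins=10):
    """Crude plug-in estimate of I(X; Y) in nats for a continuous
    feature x and discrete class labels y, via histogram binning of x."""
    # Discretize x into `bins` equal-width bins (indices 0..bins-1).
    edges = np.histogram_bin_edges(x, bins=bins)
    x_binned = np.digitize(x, edges[1:-1])
    classes = np.unique(y)
    # Joint distribution P(X in bin b, Y = c).
    joint = np.zeros((bins, len(classes)))
    for i, c in enumerate(classes):
        for b in range(bins):
            joint[b, i] = np.mean((x_binned == b) & (y == c))
    px = joint.sum(axis=1, keepdims=True)   # marginal of X
    py = joint.sum(axis=0, keepdims=True)   # marginal of Y
    # Sum joint * log(joint / (px * py)) over cells with nonzero mass.
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = np.where(joint > 0, joint / (px * py), 1.0)
    return float(np.sum(joint * np.log(ratio)))

# Synthetic two-class data (assumed for illustration).
rng = np.random.default_rng(0)
n = 5000
y = rng.integers(0, 2, n)
informative = y + rng.normal(0.0, 0.5, n)  # mean shifts with the label
garbage = rng.normal(0.0, 1.0, n)          # independent of the label

mi_informative = mutual_information(informative, y)
mi_garbage = mutual_information(garbage, y)
```

A classifier-independent selector of this flavor would discard features whose score falls below a small threshold; the informative feature scores well above the garbage feature, whose mutual information with the label is close to zero up to estimation bias.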