In the field of attribute mining, several feature selection methods have recently appeared indicating that sets of decision trees learnt from a data set can be a useful tool for selecting variables that are relevant and informative with respect to a main class variable. In this study, we claim that a new split criterion for building decision trees outperforms classic split criteria for variable selection purposes. We present an experimental study on a wide and varied set of databases, using a single decision tree built with each split criterion to select variables for the Naive Bayes classifier.
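The pipeline the abstract describes can be illustrated with a minimal sketch: grow one decision tree under a chosen split criterion, keep only the attributes that actually appear in its splits, and evaluate a Naive Bayes classifier restricted to those attributes. This is not the authors' implementation; it assumes scikit-learn, uses its "entropy" (information gain) criterion as a stand-in for the paper's new split criterion (which scikit-learn does not provide), and uses GaussianNB and the breast-cancer dataset purely for illustration.

```python
# Sketch of tree-based variable selection feeding a Naive Bayes classifier.
# The "entropy" criterion stands in for the paper's new split criterion.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# 1. Build one decision tree with the chosen split criterion.
tree = DecisionTreeClassifier(criterion="entropy", random_state=0).fit(X, y)

# 2. Select the variables used in the tree's internal splits
#    (leaf nodes are marked with a negative feature index).
used = np.unique(tree.tree_.feature[tree.tree_.feature >= 0])

# 3. Evaluate Naive Bayes on the selected variables only.
scores = cross_val_score(GaussianNB(), X[:, used], y, cv=10)
print(f"{len(used)} of {X.shape[1]} attributes selected; "
      f"mean accuracy: {scores.mean():.3f}")
```

Swapping the criterion (e.g. "gini" versus "entropy") in step 1 and comparing the resulting cross-validated accuracies mirrors, in miniature, the kind of comparison the experimental study performs across split criteria.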