In this paper, we study the application of Bagging credal decision trees, i.e. decision trees built using imprecise probabilities and uncertainty measures, to data sets with class noise (data sets with incorrectly assigned class labels). To this end, we first extend the original method for building credal decision trees to one that handles continuous features and missing data. Through an experimental study, we show that Bagging credal decision trees outperforms more complex Bagging approaches on data sets with class noise. Finally, using a bias-variance decomposition of the error, we explain the performance of Bagging credal decision trees, showing that it achieves a stronger reduction of the variance component of the error.
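As a rough illustration (not the authors' implementation), the quantity at the core of credal decision trees is the maximum entropy of the credal set that the imprecise Dirichlet model (IDM) induces from the class counts at a node. A minimal sketch, assuming the IDM with hyperparameter s = 1 and the standard "water-filling" characterisation of the entropy maximiser (the function name is illustrative):

```python
import math

def max_entropy_idm(counts, s=1.0):
    """Upper (maximum) entropy, in bits, of the IDM credal set.

    For class counts n_i (N = sum n_i), the IDM credal set contains the
    distributions with p_i in [n_i/(N+s), (n_i+s)/(N+s)].  The entropy
    maximiser is found by water-filling: start from the lower bounds and
    pour the free mass s/(N+s) onto the smallest probabilities.  The free
    mass equals each class's upper slack, so no upper bound is violated.
    """
    N = sum(counts)
    p = sorted(n / (N + s) for n in counts)
    m = s / (N + s)                      # mass left to distribute
    k = len(p)
    i = 0                                # p[0..i] form the lowest plateau
    while m > 1e-12:
        if i + 1 < k and p[i + 1] > p[i]:
            need = (p[i + 1] - p[i]) * (i + 1)   # mass to reach next level
            if need <= m:
                for j in range(i + 1):
                    p[j] = p[i + 1]
                m -= need
                i += 1
            else:                        # partial lift, mass exhausted
                add = m / (i + 1)
                for j in range(i + 1):
                    p[j] += add
                m = 0.0
        elif i + 1 < k:                  # neighbour already at same level
            i += 1
        else:                            # all uniform: spread the remainder
            add = m / k
            p = [x + add for x in p]
            m = 0.0
    return -sum(x * math.log2(x) for x in p if x > 0)
```

For a pure node with counts [4, 0] the precise entropy is 0, but the upper entropy stays strictly positive and shrinks as the sample grows; this is what makes the imprecise information gain cautious about splits supported by few samples, the property the paper exploits under class noise.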