Analysis and extension of decision trees based on imprecise probabilities: Application on noisy data
Expert Systems with Applications: An International Journal
Decision trees are simple structures used in supervised classification learning. Their classification results can be notably improved with ensemble methods such as Bagging, Boosting, or Randomization, which are widely used in the literature; among these, Bagging outperforms Boosting and Randomization in situations with classification noise. In this paper, we present an experimental study of different simple decision tree methods as base classifiers for Bagging ensembles in supervised classification, showing that simple credal decision trees (based on imprecise probabilities and uncertainty measures) outperform classical decision tree methods in this type of procedure when applied to datasets with classification noise.
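The Bagging procedure studied above can be illustrated with a minimal sketch: train each base classifier on a bootstrap resample of the (noisy) training set and predict by majority vote. This is not the credal-tree method of the paper; for brevity it uses decision stumps as hypothetical stand-ins for the base trees, a toy one-dimensional dataset, and artificially flipped labels to simulate classification noise.

```python
import random
from collections import Counter

# Toy 1-D dataset: class 0 below x = 5, class 1 at or above.
random.seed(0)
data = [(x, 0 if x < 5 else 1) for x in range(10)]

# Simulate classification noise: flip each label with probability 0.3.
noisy = [(x, 1 - y) if random.random() < 0.3 else (x, y) for x, y in data]

def train_stump(sample):
    """Pick the threshold minimising training error on the given sample.

    A decision stump (one-split tree) stands in for the base decision tree.
    """
    best_t, best_err = None, None
    for t in range(1, 10):
        err = sum(int(x >= t) != y for x, y in sample)
        if best_err is None or err < best_err:
            best_t, best_err = t, err
    return best_t

def bagging(train, n_trees=25):
    """Train n_trees stumps on bootstrap resamples; predict by majority vote."""
    thresholds = []
    for _ in range(n_trees):
        # Bootstrap resample: draw len(train) points with replacement.
        sample = [random.choice(train) for _ in train]
        thresholds.append(train_stump(sample))
    def predict(x):
        votes = Counter(int(x >= t) for t in thresholds)
        return votes.most_common(1)[0][0]
    return predict

model = bagging(noisy)
print([model(x) for x in range(10)])
```

Because each stump only sees one bootstrap resample, flipped labels tend to be outvoted by the ensemble, which is the intuition behind Bagging's robustness to classification noise discussed in the abstract.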