In this paper, we present an experimental comparison of strategies for combining decision trees built by means of imprecise probabilities and uncertainty measures. It has been shown that combining, or fusing, the information obtained from several classifiers can improve the final classification. We use the previously developed Bagging and Boosting schemes, along with a new scheme based on varying the root node according to the informativeness rank of each feature with respect to the class variable. To this end, we apply two different approaches to handling missing data and continuous variables. Through a set of performance tests on the methods analyzed here, we show that, with the appropriate approach, the Boosting scheme constitutes an excellent way to combine this type of decision tree. Notably, it yields good results even when compared with a standard Random Forest classifier, a successful and widely used procedure in the literature.
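As a rough illustration of the kind of comparison described above (not the authors' implementation), the sketch below pits Bagging and Boosting ensembles of decision trees against a Random Forest baseline using scikit-learn. The standard entropy-based trees here are only a stand-in for the paper's imprecise-probability (credal) trees, which scikit-learn does not implement, and the dataset, tree depth, and ensemble sizes are arbitrary choices for demonstration.

```python
# Minimal sketch: compare Bagging, Boosting, and Random Forest over
# decision trees via 10-fold cross-validation (scikit-learn >= 1.2 for
# the `estimator` keyword). The base trees are ordinary entropy-based
# trees, a stand-in for the paper's credal decision trees; the dataset
# has no missing values, so that aspect of the paper is not exercised.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              RandomForestClassifier)
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# Shallow base learner, as is common for boosted tree ensembles.
base = DecisionTreeClassifier(max_depth=3, random_state=0)

ensembles = {
    "Bagging": BaggingClassifier(estimator=base, n_estimators=100,
                                 random_state=0),
    "Boosting": AdaBoostClassifier(estimator=base, n_estimators=100,
                                   random_state=0),
    "Random Forest": RandomForestClassifier(n_estimators=100,
                                            random_state=0),
}

for name, clf in ensembles.items():
    scores = cross_val_score(clf, X, y, cv=10)  # 10-fold CV accuracy
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```

A faithful reproduction of the paper's experiments would replace the base learner with a credal decision tree (split criterion based on the upper entropy of credal sets) and add the paper's treatments of missing values and continuous variables.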