Previous studies of classifier ensembles for bankruptcy prediction and credit scoring have applied different ensemble schemes to complex classifiers, with the best results obtained by the Random Subspace method. The Bagging scheme was among the ensemble methods compared, but it was not applied correctly: to produce diversity in the combination, Bagging must be used with weak, unstable base classifiers. To improve on that comparison, this work applies the Bagging scheme to several decision tree models for bankruptcy prediction and credit scoring; decision trees are unstable learners and therefore encourage diversity in the combination. An experimental study shows that Bagging applied to decision trees yields the best results for bankruptcy prediction and credit scoring.
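As a minimal sketch of the setup described above, the following Python fragment bags decision trees with scikit-learn. The synthetic data set and all parameter values are illustrative assumptions, not the data or configuration used in the study; each tree is trained on a bootstrap sample of the training set, which is what makes unstable base learners diverse.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary task standing in for a bankruptcy/credit-scoring data set.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Bagging: each of the 100 trees sees its own bootstrap resample of
# (X_tr, y_tr); an unpruned decision tree is unstable, so small changes
# in the sample yield different trees, producing diversity.
bagging = BaggingClassifier(
    DecisionTreeClassifier(),  # weak, unstable base classifier
    n_estimators=100,
    random_state=0,
)
bagging.fit(X_tr, y_tr)
accuracy = bagging.score(X_te, y_te)  # majority vote over the 100 trees
```

By contrast, bagging a strong, stable classifier gives near-identical ensemble members and little benefit, which is the flaw in the earlier comparison that this work addresses.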