Because classifiers built with machine learning for real-world problems should be comprehensible, bagging needs a way to be explained. This work compares the Consolidated Tree Construction (CTC) algorithm with the Combined Multiple Models (CMM) method proposed by Domingos when both are used to extract an explanation of the classification made by bagging. The comparison covers two main aspects: accuracy and the quality of the provided explanation. The experimental results suggest that CTC is preferable to CMM. In terms of accuracy, the behaviour of CTC is closer to that of bagging than CMM's is. Analysing the complexity of the obtained classifiers, Consolidated Trees (CT trees) give simpler and therefore more comprehensible explanations than CMM classifiers. Moreover, regarding the structural stability of the built trees, the explanation given by CT trees is steadier than that given by CMM classifiers. As a consequence, a user of the classifier will feel more confident using CTC than using CMM.
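The core idea behind CMM-style explanation of an ensemble can be illustrated with a short sketch: train a bagged ensemble, relabel the training examples with the ensemble's predictions, and fit a single interpretable tree on those labels so that one tree approximates the ensemble's decision boundary. This is a minimal illustration using scikit-learn and the Iris dataset, not the authors' implementation (the original CMM also generates additional artificial examples, which is omitted here).

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Bagged ensemble of decision trees (the model we want to explain).
bagger = BaggingClassifier(n_estimators=25, random_state=0).fit(X, y)

# CMM-style step: relabel the data with the ensemble's own predictions
# and fit one comprehensible tree on those labels.
surrogate = DecisionTreeClassifier(random_state=0).fit(X, bagger.predict(X))

# Fidelity: how often the single tree agrees with the ensemble.
agreement = (surrogate.predict(X) == bagger.predict(X)).mean()
```

The single `surrogate` tree can then be inspected or plotted as an explanation of the ensemble, with `agreement` quantifying how faithfully it mimics bagging; the paper's point is that the quality and stability of such a substitute tree matter as much as its accuracy.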