In this paper we present a novel approach to combining classifiers within the Dempster-Shafer theory framework. The approach models the output of each classifier as a list of ranked decisions (classes), which is partitioned into a new evidence structure called a triplet; the resulting triplets are then combined by Dempster's rule. In a triplet, the first subset contains the decision with the largest numeric value among the classes, the second subset contains the decision with the second-largest value, and the third subset represents the uncertainty in determining the support for the former two decisions. We carry out a comparative analysis against the combination methods of majority voting, stacking, and boosting on the UCI benchmark data to demonstrate the advantage of our approach.
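The triplet construction and combination step can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the abstract does not specify how mass is assigned to the three subsets, so here the masses of the two top-ranked singletons are assumed to be their normalized scores, with the remainder assigned to the whole frame of discernment as the uncertainty term. The function names (`triplet_mass`, `dempster_combine`) and the example scores are hypothetical.

```python
from itertools import product

def triplet_mass(scores, frame):
    """Build a triplet mass function from one classifier's ranked class scores.

    Assumption: the two top-ranked classes receive their normalized scores as
    mass, and the leftover mass goes to the full frame (uncertainty).
    """
    ranked = sorted(scores, key=scores.get, reverse=True)
    top1, top2 = ranked[0], ranked[1]
    total = sum(scores.values())
    m = {
        frozenset([top1]): scores[top1] / total,   # first subset: top decision
        frozenset([top2]): scores[top2] / total,   # second subset: runner-up
    }
    # Third subset: remaining mass on the whole frame represents uncertainty.
    m[frozenset(frame)] = 1.0 - sum(m.values())
    return m

def dempster_combine(m1, m2):
    """Combine two mass functions by Dempster's rule (normalized orthogonal sum)."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y   # mass on empty intersections is conflict
    norm = 1.0 - conflict       # renormalize by the non-conflicting mass
    return {s: v / norm for s, v in combined.items()}

# Hypothetical example: two classifiers scoring three classes.
frame = {'A', 'B', 'C'}
m1 = triplet_mass({'A': 0.6, 'B': 0.3, 'C': 0.1}, frame)
m2 = triplet_mass({'A': 0.5, 'B': 0.1, 'C': 0.4}, frame)
combined = dempster_combine(m1, m2)
```

In this sketch the combined mass concentrates on the class both triplets rank first, while the mass left on the full frame keeps a record of the remaining uncertainty; the final decision would be the subset with the largest combined mass.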