Implementing Dempster's rule for hierarchical evidence
Artificial Intelligence
IEEE Transactions on Pattern Analysis and Machine Intelligence
Feature selection for ensembles
AAAI '99/IAAI '99 Proceedings of the Sixteenth National Conference on Artificial Intelligence and the Eleventh Innovative Applications of Artificial Intelligence Conference
How to Make Stacking Better and Faster While Also Taking Care of an Unknown Weakness
ICML '02 Proceedings of the Nineteenth International Conference on Machine Learning
Combining Multiple Learning Strategies for Effective Cross Validation
ICML '00 Proceedings of the Seventeenth International Conference on Machine Learning
Experiments with Classifier Combining Rules
MCS '00 Proceedings of the First International Workshop on Multiple Classifier Systems
Ensemble Methods in Machine Learning
MCS '00 Proceedings of the First International Workshop on Multiple Classifier Systems
Data Mining: Practical Machine Learning Tools and Techniques, Second Edition (Morgan Kaufmann Series in Data Management Systems)
Combining Multiple Classifiers Using Dempster's Rule for Text Categorization
Applied Artificial Intelligence
Combining Prioritized Decisions in Classification
MDAI '07 Proceedings of the 4th international conference on Modeling Decisions for Artificial Intelligence
On combining multiple classifiers using an evidential approach
AAAI'06 Proceedings of the 21st national conference on Artificial intelligence - Volume 1
A new technique for combining multiple classifiers using the Dempster-Shafer theory of evidence
Journal of Artificial Intelligence Research
A brief introduction to boosting
IJCAI'99 Proceedings of the 16th international joint conference on Artificial intelligence - Volume 2
Constructing diverse classifier ensembles using artificial training examples
IJCAI'03 Proceedings of the 18th international joint conference on Artificial intelligence
Sequential genetic search for ensemble feature selection
IJCAI'05 Proceedings of the 19th international joint conference on Artificial intelligence
Classifier outputs in the form of continuous values have often been combined using linear sum or stacking, but little is known about evidential reasoning methods for combining truncated lists of ordered decisions. In this paper we introduce a novel class-indifferent method for combining such classifier decisions. Specifically, we model each output a classifier gives on a new instance as a ranked list of decisions that is divided into two subsets of decisions, which are represented by triplet-based belief functions and then combined using Dempster's rule of combination. We present a formalism for triplet-based belief functions and establish a range of general formulae for combining these beliefs in order to arrive at a consensus decision. In addition, we carry out a comparative analysis with an alternative representation, dichotomous belief functions, on the UCI benchmark data. We also compare our combination method with the popular methods of stacking, boosting, linear sum and majority voting over the same benchmark data to demonstrate the advantage of our approach.
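The core operation the abstract relies on, Dempster's rule of combination, can be sketched in a few lines. The following is an illustrative implementation of the standard rule for two mass functions over a small frame of discernment, not the paper's triplet-based method; the class labels and mass values are hypothetical examples.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions using Dempster's rule.

    Each mass function is a dict mapping focal elements (frozensets of
    class labels) to masses that sum to 1. Masses of intersecting focal
    elements are multiplied and accumulated; mass falling on the empty
    set is treated as conflict and normalized away.
    """
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # product mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("Total conflict: the two sources are incompatible")
    norm = 1.0 - conflict
    return {focal: mass / norm for focal, mass in combined.items()}

# Two classifiers' beliefs over classes {A, B, C} (hypothetical numbers):
m1 = {frozenset({"A"}): 0.6,
      frozenset({"A", "B"}): 0.3,
      frozenset({"A", "B", "C"}): 0.1}
m2 = {frozenset({"B"}): 0.5,
      frozenset({"A", "B"}): 0.4,
      frozenset({"A", "B", "C"}): 0.1}
m = dempster_combine(m1, m2)
# The combined masses again sum to 1; here {A} ends up with the
# largest singleton mass, so A would be the consensus decision.
```

The combined mass function can be combined again with a third classifier's output in the same way, since Dempster's rule is associative and commutative.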