We investigate the diversity of classifiers combined using different combiner systems. Three diversity measures are used to calculate the diversity of the combined classifiers. We aim to find which combiner design method yields the most diverse classifiers, and whether diversity measures are good indicators of system performance. Two combiner types are compared: Bagging and a conventional three-classifier system built from three classifier types: a backpropagation neural network, a Bayesian classifier, and a k-nearest neighbor (k-NN) classifier. Results obtained on real data indicate that the system with the higher performance yields more diverse classifiers; hence, the diversity measures are related to system performance. We also found that Bagging yields more diverse classifiers when neural network classifiers are used, whereas the mixed classifier system yields more diverse classifiers when k-NN or Bayes classifiers are used. This was in line with the system performances.
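The abstract does not name the three diversity measures used, but one widely used pairwise measure in this literature is the disagreement measure: the fraction of samples on which exactly one of two classifiers is correct. The sketch below, with hypothetical function and variable names, illustrates how such a measure could be computed for an ensemble and averaged over all classifier pairs; it is an illustrative assumption, not the paper's actual method.

```python
from itertools import combinations

def disagreement(preds_a, preds_b, labels):
    """Pairwise disagreement measure: fraction of samples where
    exactly one of the two classifiers predicts the true label."""
    n = len(labels)
    # (a correct) XOR (b correct) counts samples of disagreement in correctness
    return sum((a == y) != (b == y)
               for a, b, y in zip(preds_a, preds_b, labels)) / n

def mean_pairwise_disagreement(ensemble_preds, labels):
    """Average the disagreement measure over all pairs of ensemble members.
    ensemble_preds: list of per-classifier prediction lists (hypothetical layout)."""
    pairs = list(combinations(ensemble_preds, 2))
    return sum(disagreement(a, b, labels) for a, b in pairs) / len(pairs)

# Toy illustration with three classifiers on four samples
labels = [1, 0, 0, 0]
preds = [
    [1, 1, 0, 0],  # classifier 1 (e.g., neural network)
    [1, 0, 0, 1],  # classifier 2 (e.g., Bayes)
    [0, 0, 0, 0],  # classifier 3 (e.g., k-NN)
]
print(mean_pairwise_disagreement(preds, labels))
```

Higher values indicate a more diverse ensemble; comparing this quantity across combiner designs (e.g., Bagging vs. a mixed-classifier system) is one way to relate diversity to combined-system performance.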