Experts' Boasting in Trainable Fusion Rules
IEEE Transactions on Pattern Analysis and Machine Intelligence
If no large design data set is available for building a multiple classifier system, one typically uses the same data set to design both the expert classifiers and the fusion rule. In that case, the experts' outputs form an optimistically biased training set for the fusion rule designer. We consider the standard Fisher linear and Euclidean distance classifiers as experts and a single-layer perceptron as the fusion rule. Original bias-correction terms for the experts' answers are derived for these two types of expert classifiers under the assumption of high-dimensional Gaussian distributions. In addition, noise injection is presented as a more universal technique. Experiments with specially designed artificial Gaussian data and real-world medical data show that the theoretical bias correction works well for high-dimensional artificial data, while the noise injection technique is preferable for real-world problems.
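The two-stage scheme described in the abstract can be sketched in code: Fisher linear and Euclidean distance (nearest-class-mean) experts are trained on the design set, and a single-layer perceptron fuses their discriminant outputs, with noise injection used to make the second-stage training set less optimistically biased. This is a minimal illustration under stated assumptions, not the paper's implementation: the data dimensionality, class means, noise level `sigma`, and number of noisy copies are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class Gaussian design data (hypothetical stand-in for the
# paper's artificial data; dimensions and means are assumptions).
d, n = 10, 100
mean1, mean2 = np.zeros(d), np.full(d, 0.8)
X = np.vstack([rng.normal(mean1, 1.0, (n, d)),
               rng.normal(mean2, 1.0, (n, d))])
y = np.concatenate([np.zeros(n), np.ones(n)])

def euclidean_expert(X, y):
    """Nearest-class-mean (Euclidean distance) classifier: g(x) > 0 => class 1."""
    m0, m1 = X[y == 0].mean(0), X[y == 1].mean(0)
    w = m1 - m0
    return lambda Z: Z @ w - 0.5 * (m1 + m0) @ w

def fisher_expert(X, y):
    """Standard Fisher linear discriminant with pooled within-class scatter."""
    m0, m1 = X[y == 0].mean(0), X[y == 1].mean(0)
    Sw = np.cov(X[y == 0].T) + np.cov(X[y == 1].T)
    w = np.linalg.solve(Sw, m1 - m0)
    return lambda Z: Z @ w - 0.5 * (m1 + m0) @ w

experts = [euclidean_expert(X, y), fisher_expert(X, y)]

def expert_outputs(Z):
    return np.column_stack([g(Z) for g in experts])

# Noise injection: the second-stage training set is built from perturbed
# copies of the design samples, so the experts' outputs on it are less
# optimistically biased than on the clean training data itself.
sigma, copies = 0.5, 5
Xn = np.vstack([X + rng.normal(0, sigma, X.shape) for _ in range(copies)])
yn = np.tile(y, copies)
Fb = np.hstack([expert_outputs(Xn), np.ones((len(Xn), 1))])  # bias column

# Single-layer perceptron (logistic unit) trained by gradient descent
# as the trainable fusion rule.
W = np.zeros(Fb.shape[1])
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-Fb @ W))
    W -= 0.1 * Fb.T @ (p - yn) / len(yn)

# Evaluate the fused classifier on an independent test set.
Xt = np.vstack([rng.normal(mean1, 1.0, (1000, d)),
                rng.normal(mean2, 1.0, (1000, d))])
yt = np.concatenate([np.zeros(1000), np.ones(1000)])
Ft = np.hstack([expert_outputs(Xt), np.ones((2000, 1))])
acc = np.mean((Ft @ W > 0) == (yt == 1))
print(f"fused test accuracy: {acc:.3f}")
```

The same design set is deliberately reused for both stages, which is exactly the small-sample setting the paper addresses; without the noisy copies, the perceptron would be fit to the experts' optimistically "boasting" outputs on their own training data.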