The veto effect, which arises when contradicting experts output zero probability estimates, causes fusion strategies to perform suboptimally. This can be resolved using moderation. The moderation formula is derived for the k-NN classifier using a Bayesian prior, and the merits of moderation are examined on real data sets.
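As a rough illustration of the problem and the remedy, the sketch below contrasts raw k-NN posterior estimates with moderated ones under product-rule fusion. The uniform-prior (Laplace) smoothing form (k_i + 1) / (k + c) is an assumption here — a standard Bayesian estimate consistent with the abstract, not necessarily the paper's exact derivation — and the expert neighbourhoods, class counts, and helper names are hypothetical.

```python
import numpy as np

def knn_estimates(neighbor_labels, n_classes, k):
    """Raw k-NN posterior estimates: fraction of the k neighbours per class.
    Classes absent from the neighbourhood get an estimate of exactly zero."""
    counts = np.bincount(neighbor_labels, minlength=n_classes)
    return counts / k

def moderated_estimates(neighbor_labels, n_classes, k):
    """Moderated estimates via a uniform Bayesian prior (Laplace smoothing):
    (k_i + 1) / (k + n_classes). No class can receive a zero estimate, so a
    single expert can no longer veto a class under the product rule."""
    counts = np.bincount(neighbor_labels, minlength=n_classes)
    return (counts + 1) / (k + n_classes)

# Two k-NN experts classifying the same sample (3 classes, k = 5).
# Expert A's neighbourhood is all class 0; expert B's contains no class 0.
expert_a = np.array([0, 0, 0, 0, 0])
expert_b = np.array([1, 1, 2, 2, 2])

raw = [knn_estimates(e, 3, 5) for e in (expert_a, expert_b)]
mod = [moderated_estimates(e, 3, 5) for e in (expert_a, expert_b)]

# Product-rule fusion: with raw estimates each expert vetoes the other's
# classes, so every fused score is zero and the fusion cannot decide.
print("raw fused:      ", raw[0] * raw[1])   # [0. 0. 0.]
# With moderation all scores stay positive and class 0, which has the
# stronger overall support, correctly receives the highest fused score.
print("moderated fused:", mod[0] * mod[1])   # [0.094 0.047 0.062]
```

Product-rule fusion is used here because it is the strategy most visibly damaged by zero estimates: a single zero annihilates a class's fused score regardless of how strongly the other experts support it, which is precisely the veto effect the abstract describes.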