We consider a general scheme of parallel classifier combination in the framework of statistical pattern recognition. Each statistical classifier defines a set of output variables in terms of a posteriori probabilities, i.e., it is used as a feature extractor. Unlike the usual combining schemes, the output vectors of the classifiers are combined in parallel. Statistical Shannon information is used as a criterion to compare different combining schemes with respect to the theoretically available decision information. By means of relatively simple arguments we derive a theoretical hierarchy among different schemes of classifier fusion in terms of information inequalities.
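As a toy illustration of the kind of information inequality involved (the setup below is our own, not taken from the paper): keeping the output vectors of two classifiers side by side can never carry less Shannon information about the class label than either output alone, since each single output is a function of the parallel combination. A minimal empirical sketch with two noisy binary "classifiers":

```python
import math
import random
from collections import Counter

def mutual_information(labels, outputs):
    """Empirical Shannon mutual information I(Y; Z) in bits,
    for discrete labels Y and discrete classifier outputs Z."""
    n = len(labels)
    joint = Counter(zip(labels, outputs))
    p_y = Counter(labels)
    p_z = Counter(outputs)
    mi = 0.0
    for (y, z), c in joint.items():
        # p(y,z) * log2( p(y,z) / (p(y) * p(z)) ), with counts c/n etc.
        mi += (c / n) * math.log2(c * n / (p_y[y] * p_z[z]))
    return mi

random.seed(0)
n = 20000
y = [random.randint(0, 1) for _ in range(n)]
# Two simulated classifiers: each flips the true label with
# probability 0.2, with independent noise.
z1 = [yi ^ (random.random() < 0.2) for yi in y]
z2 = [yi ^ (random.random() < 0.2) for yi in y]
# Parallel combination: keep both output vectors side by side.
zc = list(zip(z1, z2))

i1 = mutual_information(y, z1)
i2 = mutual_information(y, z2)
ic = mutual_information(y, zc)
# The parallel combination can only gain decision information:
# I(Y; (Z1, Z2)) >= max(I(Y; Z1), I(Y; Z2)).
assert ic >= max(i1, i2)
```

The inequality holds for the empirical distribution by the data-processing argument, since each individual output is a deterministic projection of the combined output; the paper's hierarchy orders more elaborate fusion schemes by the same kind of reasoning.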