Engineering Applications of Artificial Intelligence
The design of an ensemble of neural networks can be decomposed into two steps. The first consists in generating the ensemble, i.e., training networks that differ significantly from one another. The second consists in properly combining the information provided by the networks. Adaptive Boosting (Adaboost), one of the best-performing ensemble methods, has been studied and improved by several authors, including us. Moreover, Adaboost and its variants use a specific combiner based on the error of the networks. Unfortunately, no thorough study of combiners for this kind of ensemble has been carried out yet. In this paper, we study the performance of several important ensemble combiners on ensembles previously trained with Adaboost and Aveboost. The results show that applying the appropriate combiner can provide an extra increase in performance.
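To make the second step concrete, the following minimal NumPy sketch (an illustration, not the paper's implementation) contrasts three standard combiners for an ensemble of trained networks: the output average, the majority vote, and the Adaboost-style weighted vote, in which each network is weighted by log((1 - e) / e) using its error e.

```python
import numpy as np

# outputs: array of shape (n_networks, n_samples, n_classes)
# holding each network's per-class outputs (e.g., softmax scores).

def combine_average(outputs):
    """Average combiner: mean of the per-class outputs across networks."""
    return np.mean(outputs, axis=0).argmax(axis=1)

def combine_majority(outputs):
    """Majority vote: each network votes for its top class."""
    votes = outputs.argmax(axis=2)            # shape (n_networks, n_samples)
    n_classes = outputs.shape[2]
    # Count votes per class for each sample, then pick the most voted class.
    counts = np.apply_along_axis(
        lambda v: np.bincount(v, minlength=n_classes), 0, votes)
    return counts.argmax(axis=0)

def combine_weighted(outputs, errors):
    """Adaboost-style combiner: weight network k by log((1 - e_k) / e_k)."""
    errors = np.asarray(errors, dtype=float)
    weights = np.log((1.0 - errors) / errors)
    # Weighted sum over the network axis, then pick the top class.
    return np.tensordot(weights, outputs, axes=1).argmax(axis=1)

# Toy ensemble: 3 networks, 2 samples, 2 classes (values are illustrative).
outputs = np.array([
    [[0.9, 0.1], [0.2, 0.8]],
    [[0.6, 0.4], [0.7, 0.3]],
    [[0.3, 0.7], [0.4, 0.6]],
])
errors = [0.1, 0.4, 0.3]  # hypothetical training errors of the networks

print(combine_average(outputs))          # [0 1]
print(combine_majority(outputs))         # [0 1]
print(combine_weighted(outputs, errors)) # [0 1]
```

On this toy data all three combiners agree; the point of the study is precisely that on real ensembles they need not, so the choice of combiner can change the final accuracy.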