Training an ensemble of neural networks is an interesting way to build a multi-net system. One of the key design factors for an ensemble is how to combine the networks' outputs into a single output. Among the important methods for building ensembles, Boosting is one of the most prominent. Most Boosting-based methods use a specific combiner, the Boosting combiner. Although the Boosting combiner provides good results on boosting ensembles, results from previous papers show that the simple Output Average combiner can work better than the Boosting combiner. In this paper, we study the performance of sixteen different combination methods for ensembles previously trained with Adaptive Boosting and Average Boosting. The results show that the accuracy of ensembles trained with these original boosting methods can be improved by using an appropriate alternative combiner.
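The two combiners contrasted in the abstract can be sketched as follows. This is a minimal illustration (hypothetical function names, NumPy only), assuming the Boosting combiner is the standard AdaBoost weighted vote with weights alpha_t = log((1 - err_t) / err_t), and the Output Average simply averages the members' per-class scores:

```python
import numpy as np

def boosting_combiner(predictions, errors):
    """Boosting combiner: weighted vote over member predictions.

    predictions: (T, N) array of class labels, one row per network.
    errors: (T,) array of each network's weighted training error.
    """
    alphas = np.log((1.0 - errors) / errors)  # standard AdaBoost vote weights
    n_samples = predictions.shape[1]
    n_classes = predictions.max() + 1
    votes = np.zeros((n_samples, n_classes))
    for t, alpha in enumerate(alphas):
        # each network adds its weight to the class it predicted
        votes[np.arange(n_samples), predictions[t]] += alpha
    return votes.argmax(axis=1)

def output_average(scores):
    """Output Average combiner: mean of the networks' class scores.

    scores: (T, N, C) array of per-network, per-sample class scores.
    """
    return scores.mean(axis=0).argmax(axis=1)
```

Note that the Boosting combiner fixes the vote weights from the training errors of each round, whereas Output Average ignores those weights entirely; the paper's point is that this simpler rule (and other alternative combiners) can outperform the original weighted vote on boosted ensembles.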