Soft combination of neural classifiers: A comparative study. Pattern Recognition Letters.
Using diversity with three variants of boosting: Aggressive, conservative, and inverse. MCS '02: Proceedings of the Third International Workshop on Multiple Classifier Systems.
Boosting and other ensemble methods. Neural Computation.
Boosting with averaged weight vectors. MCS '03: Proceedings of the 4th International Conference on Multiple Classifier Systems.
As the bibliography shows, Adaptive Boosting (Adaboost) is one of the best-known methods for increasing the performance of an ensemble of neural networks. We introduce a new method, Crossboost, which is based on Adaboost and applies cross-validation to increase the diversity of the ensemble: cross-validation over the whole learning set generates a specific training set and validation set for each network of the committee. We have tested Adaboost and Crossboost on seven databases from the UCI repository, comparing them by the mean percentage of error reduction and the mean increase of performance; the results show that Crossboost performs better.
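The following is a minimal sketch, not the authors' implementation, of how boosting-style reweighting can be combined with k-fold cross-validation so that each ensemble member receives its own training and validation fold. The function name cross_boost, the use of scikit-learn's MLPClassifier, and the weighted-resampling step (used because MLPClassifier does not accept sample weights) are assumptions made for illustration only.

```python
import numpy as np
from sklearn.model_selection import KFold
from sklearn.neural_network import MLPClassifier


def cross_boost(X, y, n_members=5, seed=0):
    """Sketch: AdaBoost-style ensemble where each member gets its own CV fold."""
    rng = np.random.default_rng(seed)
    n = len(X)
    weights = np.full(n, 1.0 / n)  # boosting distribution over the learning set
    folds = list(KFold(n_splits=n_members, shuffle=True, random_state=seed).split(X))
    members, alphas = [], []
    for k in range(n_members):
        train_idx, val_idx = folds[k]  # this member's own training/validation split
        # Emulate weighted training by resampling the training fold
        # according to the current boosting weights.
        p = weights[train_idx] / weights[train_idx].sum()
        sample = rng.choice(train_idx, size=len(train_idx), p=p)
        net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500,
                            random_state=seed + k).fit(X[sample], y[sample])
        # The validation fold could drive early stopping or model selection;
        # here it is only scored for inspection.
        val_acc = net.score(X[val_idx], y[val_idx])
        # Standard two-class AdaBoost update on the whole learning set.
        pred = net.predict(X)
        err = np.clip(np.sum(weights[pred != y]), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        weights *= np.exp(alpha * np.where(pred != y, 1.0, -1.0))
        weights /= weights.sum()
        members.append(net)
        alphas.append(alpha)
    return members, np.array(alphas)


def ensemble_predict(members, alphas, X):
    """Weighted vote of the ensemble members."""
    classes = members[0].classes_
    scores = np.zeros((len(X), len(classes)))
    for net, a in zip(members, alphas):
        pred = net.predict(X)
        for j, c in enumerate(classes):
            scores[:, j] += a * (pred == c)
    return classes[np.argmax(scores, axis=1)]
```

In this sketch the only difference from a plain resampling variant of Adaboost is that each network trains on (a weighted resample of) its own cross-validation training fold and keeps the held-out fold as its private validation set, which is one way the per-network diversity described in the abstract could be realized.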