The Strength of Weak Learnability
Machine Learning
Boosting a weak learning algorithm by majority
Information and Computation
Machine Learning
A decision-theoretic generalization of on-line learning and an application to boosting
Journal of Computer and System Sciences - Special issue: 26th Annual ACM Symposium on the Theory of Computing (STOC '94), May 23-25, 1994, and Second Annual European Conference on Computational Learning Theory (EuroCOLT '95), March 13-15, 1995
Ensembling neural networks: many could be better than all
Artificial Intelligence
IEEE Transactions on Pattern Analysis and Machine Intelligence
Genetic algorithm based selective neural network ensemble
IJCAI'01: Proceedings of the 17th International Joint Conference on Artificial Intelligence - Volume 2
In view of the comparability between a neural network ensemble and an Adaline, an algorithm for modifying the ensemble weights based on gradient descent is presented. The algorithm can improve generalization performance by adjusting the weights assigned to the subnets after the ensemble's subnets have been trained individually. Simulation results indicate that the new algorithm provides a subnet-selection capability similar to that of GASEN, although it is based on a different idea, and that it performs better than a single network, a simple-averaged ensemble, and GASEN.
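The Adaline analogy can be read as follows: the ensemble output is a weighted sum of the already-trained subnets' outputs, so the combination weights can be tuned by gradient descent on a squared-error loss, exactly as an Adaline's input weights would be. The sketch below is a minimal illustration of that idea under these assumptions, not the paper's exact procedure; the function name tune_ensemble_weights, the learning rate, the epoch count, and the use of mean squared error are all assumed for illustration.

```python
import numpy as np

def tune_ensemble_weights(subnet_outputs, targets, lr=0.05, epochs=500):
    """Illustrative sketch (assumed details): adjust ensemble combination
    weights by gradient descent, treating the ensemble as an Adaline whose
    inputs are the subnets' outputs.

    subnet_outputs: (n_samples, n_subnets) predictions of individually
                    trained subnets on a validation set.
    targets:        (n_samples,) desired outputs.
    """
    n_samples, n_subnets = subnet_outputs.shape
    w = np.full(n_subnets, 1.0 / n_subnets)        # start from simple averaging
    for _ in range(epochs):
        pred = subnet_outputs @ w                  # ensemble output = weighted sum
        grad = subnet_outputs.T @ (pred - targets) / n_samples  # gradient of MSE/2
        w -= lr * grad                             # gradient-descent update
    return w

# Toy usage: three subnets, the third contributing mostly noise.
rng = np.random.default_rng(0)
outs = rng.normal(size=(200, 3))
y = 0.7 * outs[:, 0] + 0.3 * outs[:, 1] + 0.05 * rng.normal(size=200)
print(tune_ensemble_weights(outs, y))              # third weight shrinks toward zero
```

Subnets whose combination weights are driven toward zero are effectively deselected, which is one way to picture how such a weight-modification scheme can behave like a selection method such as GASEN while resting on a different idea.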