The Strength of Weak Learnability. Machine Learning.
Neural networks and the bias/variance dilemma. Neural Computation.
Boosting a weak learning algorithm by majority. Information and Computation.
Machine Learning.
A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences (special issue: 26th Annual ACM Symposium on the Theory of Computing, STOC '94, May 23–25, 1994, and Second Annual European Conference on Computational Learning Theory, EuroCOLT '95, March 13–15, 1995).
Boosted mixture of experts: an ensemble learning scheme. Neural Computation.
Improved Boosting Algorithms Using Confidence-rated Predictions. Machine Learning (Eleventh Annual Conference on Computational Learning Theory).
Ensembling neural networks: many could be better than all. Artificial Intelligence.
IEEE Transactions on Pattern Analysis and Machine Intelligence.
Improving Regressors using Boosting Techniques. ICML '97: Proceedings of the Fourteenth International Conference on Machine Learning.
Genetic algorithm based selective neural network ensemble. IJCAI '01: Proceedings of the 17th International Joint Conference on Artificial Intelligence, Volume 2.
Simultaneous training of negatively correlated neural networks in an ensemble. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics.
A constructive algorithm for training cooperative neural network ensembles. IEEE Transactions on Neural Networks.
A gradient-based algorithm for modifying ensemble weights is presented and applied to regression tasks. Simulation results show that the method produces an estimator ensemble with better generalization than bagging or a single neural network. Like GASEN, the method can select a subset of the trained networks to form the ensemble, but it outperforms GASEN, bagging, and the best individual regression estimator.
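The abstract does not specify the loss or the update rule, but the core idea of gradient-based ensemble-weight modification can be sketched as follows. This is a minimal illustration, assuming squared-error loss on held-out predictions, weights kept non-negative and renormalized to a convex combination after each step (the function name and hyperparameters are hypothetical, not from the paper):

```python
import numpy as np

def train_ensemble_weights(preds, y, lr=0.05, epochs=500):
    """Gradient descent on ensemble combination weights.

    preds: (n_models, n_samples) array of predictions from trained base regressors
    y:     (n_samples,) array of targets
    Returns a weight vector summing to 1.
    """
    n_models = preds.shape[0]
    w = np.full(n_models, 1.0 / n_models)        # start from uniform averaging
    for _ in range(epochs):
        ens = w @ preds                          # current ensemble prediction
        grad = 2.0 * preds @ (ens - y) / y.size  # gradient of MSE w.r.t. w
        w -= lr * grad
        w = np.clip(w, 0.0, None)                # keep weights non-negative
        w /= w.sum()                             # renormalize to a convex combination
    return w
```

Networks whose weights are driven to (near) zero are effectively dropped, which is how such a scheme can play a subset-selection role similar to GASEN while remaining a simple gradient procedure.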