By introducing an adaptive error function, balanced ensemble learning was developed from negative correlation learning. In this paper, balanced ensemble learning is used to train a set of small neural networks, each with only one hidden node. The experimental results suggest that balanced ensemble learning can create a strong ensemble by combining a set of weak learners. Unlike bagging and boosting, where learners are trained on data randomly re-sampled from the original set of patterns, learners in balanced ensemble learning can be trained on all available data. Interestingly, the learners produced by balanced ensemble learning may be only slightly better than random guessing even though they were trained on the whole data set. Another difference among these ensemble learning methods is that learners are trained simultaneously in balanced ensemble learning, whereas they are trained independently in bagging and sequentially in boosting.
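The simultaneous training scheme described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's algorithm: it assumes an ensemble of one-hidden-node tanh networks trained together on the full data set, with a hypothetical adaptive per-pattern weight (`beta`) that up-weights patterns the current ensemble output misclassifies; the penalty constant `lam`, the learning rate, and the toy data are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification data: two Gaussian blobs, labels in {-1, +1}.
X = np.vstack([rng.normal(-1.0, 1.0, (100, 2)), rng.normal(1.0, 1.0, (100, 2))])
y = np.concatenate([-np.ones(100), np.ones(100)])

M = 20      # number of weak learners, each with a single hidden node
lr = 0.05   # learning rate (assumption)
lam = 1.0   # extra weight on ensemble-misclassified patterns (hypothetical)

# Learner k computes v[k] * tanh(x @ W[k] + b[k]).
W = rng.normal(0, 0.5, (M, 2))
b = rng.normal(0, 0.5, M)
v = rng.normal(0, 0.5, M)

for epoch in range(200):
    H = np.tanh(X @ W.T + b)    # hidden activations, shape (N, M)
    F = H * v                   # individual learner outputs, shape (N, M)
    ens = F.mean(axis=1)        # ensemble output (simple averaging)

    # Adaptive per-pattern weights: all learners are trained simultaneously,
    # and patterns the current ensemble misclassifies get a larger weight.
    wrong = (ens * y) < 0
    beta = np.where(wrong, 1.0 + lam, 1.0)

    # Gradient of the weighted squared error of each learner toward the target.
    err = (F - y[:, None]) * beta[:, None]   # shape (N, M)
    dv = (err * H).mean(axis=0)
    dH = err * v * (1.0 - H**2)
    dW = dH.T @ X / len(X)
    db = dH.mean(axis=0)
    v -= lr * dv
    W -= lr * dW
    b -= lr * db

# Accuracy of the trained ensemble on the training data.
ens = (np.tanh(X @ W.T + b) * v).mean(axis=1)
acc = float(((ens * y) > 0).mean())
print(f"ensemble accuracy: {acc:.2f}")
```

The key contrast with bagging and boosting is visible in the loop: every learner sees every pattern at every step, and the only coupling between learners is the shared, ensemble-driven weight `beta`.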