Balanced Learning for Ensembles with Small Neural Networks
ISICA '09 Proceedings of the 4th International Symposium on Advances in Computation and Intelligence
In the practice of designing neural network ensembles, it is common to define a single learning error function that is kept fixed for every individual neural network throughout the learning process. Such a fixed error function not only tends to lead to over-fitting, but also makes learning slow on hard-to-learn data points in the data set. This paper presents a novel balanced ensemble learning approach that makes learning fast and robust. The idea of balanced ensemble learning is to define adaptive learning error functions for the different individual neural networks in an ensemble, so that different individuals may have error functions of different forms, and these error functions may also change during learning. By shifting attention away from well-learned data and onto not-yet-learned data through changes to each individual's error function, a well-balanced learning process can be achieved for the trained ensemble.
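The abstract does not specify the exact form of the adaptive error functions, so the following is only a minimal sketch of the general idea: an ensemble of small one-hidden-layer networks is trained with a per-sample weighted loss, and the weights are re-derived each epoch so that points the ensemble already classifies correctly contribute less (the down-weighting factor 0.2, the network sizes, and the toy data are all illustrative assumptions, not details from the paper; the paper also allows different individuals to receive different error functions, whereas this sketch applies the same adaptive weights to all of them).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-D binary classification data: two Gaussian blobs (illustrative only).
X = np.vstack([rng.normal(-1.0, 0.7, (60, 2)), rng.normal(1.0, 0.7, (60, 2))])
y = np.array([0.0] * 60 + [1.0] * 60)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class SmallNet:
    """A small one-hidden-layer network trained by weighted gradient descent."""
    def __init__(self, n_in, n_hidden, rng):
        self.W1 = rng.normal(0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.5, n_hidden)
        self.b2 = 0.0

    def forward(self, X):
        self.H = np.tanh(X @ self.W1 + self.b1)
        return sigmoid(self.H @ self.W2 + self.b2)

    def step(self, X, y, weights, lr=0.5):
        p = self.forward(X)
        # Weighted cross-entropy gradient: each sample's contribution is
        # scaled by its current (adaptive) weight.
        delta = weights * (p - y)                      # dL/dz2, shape (n,)
        gW2 = self.H.T @ delta / len(y)
        gb2 = delta.mean()
        dH = np.outer(delta, self.W2) * (1 - self.H ** 2)
        gW1 = X.T @ dH / len(y)
        gb1 = dH.mean(axis=0)
        self.W2 -= lr * gW2; self.b2 -= lr * gb2
        self.W1 -= lr * gW1; self.b1 -= lr * gb1

nets = [SmallNet(2, 3, rng) for _ in range(4)]
weights = np.ones(len(y))                  # start from a uniform error function
for epoch in range(200):
    ens = np.mean([net.forward(X) for net in nets], axis=0)
    correct = (ens > 0.5) == (y > 0.5)
    # Adaptive error function: shift emphasis away from well-learned points
    # and onto not-yet-learned ones (factor 0.2 is an assumed choice).
    weights = np.where(correct, 0.2, 1.0)
    for net in nets:
        net.step(X, y, weights)

ens = np.mean([net.forward(X) for net in nets], axis=0)
acc = np.mean((ens > 0.5) == (y > 0.5))
print(f"ensemble accuracy: {acc:.2f}")
```

Because the weights are recomputed from the current ensemble output every epoch, a point that is learned and then forgotten automatically regains full weight, which is the "balancing" behaviour the abstract describes.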