The Strength of Weak Learnability
Machine Learning
Ensembling neural networks: many could be better than all
Artificial Intelligence
Integrated neural network ensemble algorithm based on clustering technology
ICONIP'06 Proceedings of the 13th International Conference on Neural Information Processing - Volume Part I
A parallel training algorithm for component neural networks, PLA, is proposed, which encourages each component network to learn both from the expected goal and from the other components, so that all component networks are trained simultaneously and interactively. For the stage of combining the component networks, we provide a parallel weight-optimization approach, GASEN-e, which extends the GASEN algorithm proposed by Zhou et al. by assigning a weight to every component network and a bias to their ensemble. Experimental results show that a neural network ensemble system can be constructed efficiently by PLA and GASEN-e.
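The combination form the abstract describes, a per-component weight plus an ensemble-level bias, can be sketched as follows. This is a minimal illustration of that weighted sum, not the paper's implementation; the function name, the toy predictions, and the weight values are all assumptions for demonstration.

```python
import numpy as np

def ensemble_predict(component_preds, weights, bias):
    """Combine component network outputs as a weighted sum plus an
    ensemble-level bias, the combination form described in the abstract.
    component_preds has shape (n_components, n_samples)."""
    # Weighted sum over the component axis, then shift by the ensemble bias.
    return np.tensordot(weights, component_preds, axes=1) + bias

# Toy example (illustrative values only): three component networks,
# each predicting for two samples.
preds = np.array([[0.2, 0.8],
                  [0.4, 0.6],
                  [0.3, 0.7]])
weights = np.array([0.5, 0.3, 0.2])  # per-component weights
bias = 0.05                          # ensemble-level bias term

print(ensemble_predict(preds, weights, bias))
```

In GASEN-style approaches the weights themselves would be found by an evolutionary search; here they are fixed constants so the combination step alone is visible.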