We have recently proposed a novel algorithm for ensemble creation called GEMS (Genetic Ensemble Member Selection). GEMS first trains a fixed number of neural networks (here twenty) and then uses genetic programming to combine these networks into an ensemble. The use of genetic programming makes it possible for GEMS not only to consider ensembles of different sizes, but also to use ensembles as intermediate building blocks. In this paper, which is the first extensive study of GEMS, the representation language is extended to include tests that partition the data, further increasing flexibility. In addition, several techniques are applied to reduce overfitting, which appears to be the main problem for this powerful algorithm. The experiments show that GEMS, when evaluated on 15 publicly available data sets, obtains very high accuracy, clearly outperforming both straightforward ensemble designs and standard decision tree algorithms.
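The core idea described above can be sketched in code: train a pool of base models, then evolve tree-structured ensembles in which a leaf is a trained model and an internal node averages its subtrees, so that smaller ensembles serve as building blocks for larger ones. This is a minimal illustrative sketch, not the authors' implementation: GEMS trains neural networks and uses full genetic programming with data-partitioning tests, whereas here the base learners are hypothetical toy threshold classifiers and the search is a simplified mutation-only evolutionary loop.

```python
import random

def make_data(n, rng):
    # Toy two-feature data; class 1 when the features sum above 1.
    X = [[rng.random(), rng.random()] for _ in range(n)]
    y = [1 if x[0] + x[1] > 1.0 else 0 for x in X]
    return X, y

def make_model(rng):
    # Hypothetical stand-in for one trained network: a single
    # feature-threshold classifier.
    feat, thr = rng.randrange(2), rng.uniform(0.2, 0.8)
    return lambda x: 1.0 if x[feat] > thr else 0.0

def predict(tree, models, x):
    # A tree is either a leaf (model index) or ("avg", left, right),
    # which averages its two subtrees -- ensembles as building blocks.
    if isinstance(tree, int):
        return models[tree](x)
    _, left, right = tree
    return 0.5 * (predict(left, models, x) + predict(right, models, x))

def accuracy(tree, models, X, y):
    hits = sum((predict(tree, models, x) > 0.5) == bool(t)
               for x, t in zip(X, y))
    return hits / len(y)

def random_tree(rng, n_models, depth=2):
    if depth == 0 or rng.random() < 0.3:
        return rng.randrange(n_models)
    return ("avg", random_tree(rng, n_models, depth - 1),
                   random_tree(rng, n_models, depth - 1))

def mutate(tree, rng, n_models):
    # Replace a randomly chosen subtree with a fresh random tree.
    if isinstance(tree, int) or rng.random() < 0.5:
        return random_tree(rng, n_models)
    _, left, right = tree
    if rng.random() < 0.5:
        return ("avg", mutate(left, rng, n_models), right)
    return ("avg", left, mutate(right, rng, n_models))

def evolve(models, X, y, rng, pop=30, gens=40):
    # Seed with every single model plus random trees; keep the best
    # half each generation (elitism) and refill with mutants.
    population = list(range(len(models)))
    population += [random_tree(rng, len(models)) for _ in range(pop - len(population))]
    for _ in range(gens):
        population.sort(key=lambda t: -accuracy(t, models, X, y))
        survivors = population[: pop // 2]
        population = survivors + [mutate(rng.choice(survivors), rng, len(models))
                                  for _ in range(pop - len(survivors))]
    return max(population, key=lambda t: accuracy(t, models, X, y))

rng = random.Random(0)
X, y = make_data(300, rng)
models = [make_model(rng) for _ in range(20)]
best = evolve(models, X, y, rng)
best_single = max(range(len(models)), key=lambda i: accuracy(i, models, X, y))
```

Because the single models are seeded into the initial population and the best individual always survives, the evolved tree is never worse than the best base model on the selection data; in the paper the fitness is instead measured so as to control the overfitting this greedy selection invites.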