In this paper we present a new method for creating neural network ensembles. In an ensemble method such as bagging, one must train multiple neural networks to build the ensemble. Here we present a scheme that generates different copies of a single trained network and uses those copies to form the ensemble. The copies are produced by adding controlled noise to the trained base network. We provide a preliminary theoretical justification for our method and validate it experimentally on several standard data sets. Our method can improve the accuracy of the base network and yields considerable savings in training time compared to bagging, since only one network is trained.
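The abstract's idea can be sketched in a few lines of NumPy. The sketch below is an illustration under assumptions, not the paper's exact procedure: it uses a tiny fixed-weight MLP as a stand-in for the trained base network, perturbs its weights with Gaussian noise of standard deviation `sigma` (the paper's precise noise model may differ), and combines the copies by averaging their outputs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "trained" base network: one hidden layer with fixed
# weights, standing in for a network trained on real data.
W1 = rng.normal(size=(4, 8))
W2 = rng.normal(size=(8, 3))

def predict(x, w1, w2):
    """Forward pass: tanh hidden layer, softmax output."""
    h = np.tanh(x @ w1)
    z = h @ w2
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def noisy_copies(w1, w2, n_copies=10, sigma=0.05, rng=rng):
    """Generate ensemble members by adding controlled Gaussian noise
    to the trained weights (an assumed perturbation scheme)."""
    return [(w1 + rng.normal(scale=sigma, size=w1.shape),
             w2 + rng.normal(scale=sigma, size=w2.shape))
            for _ in range(n_copies)]

def ensemble_predict(x, copies):
    """Combine the copies by averaging their output distributions."""
    return np.mean([predict(x, w1, w2) for w1, w2 in copies], axis=0)

x = rng.normal(size=(5, 4))       # a small batch of 5 inputs
copies = noisy_copies(W1, W2)
p = ensemble_predict(x, copies)
print(p.shape)                    # (5, 3); each row sums to 1
```

Unlike bagging, which would train `n_copies` networks from scratch on bootstrap samples, only one network is trained here; the ensemble diversity comes entirely from the weight perturbations.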