In this work we propose a new method for creating neural network ensembles. Our methodology builds on the conventional technique of bagging, in which multiple classifiers are trained on a single training data set by generating multiple bootstrap samples from that data. We propose a new sampling method based on k-nearest neighbor density estimates. Our sampling technique produces more variability across the generated data sets than bagging does. We validate our method on several real data sets and show that it outperforms bagging.
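To make the idea concrete, the following is a minimal sketch of density-weighted bootstrap sampling. It is an illustration under stated assumptions, not the paper's exact procedure: the density estimator here (inverse distance to the k-th nearest neighbor) and the choice to make selection probabilities proportional to it are assumptions made for the example.

```python
import numpy as np

def knn_density_weights(X, k=5):
    """Simple k-NN density score per point: the inverse of the distance
    to the k-th nearest neighbor (larger score = denser region).
    NOTE: this estimator is an illustrative choice, not necessarily the
    one used in the paper."""
    # pairwise Euclidean distances
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # distance to the k-th nearest neighbor; column 0 is the point itself
    r_k = np.sort(d, axis=1)[:, k]
    return 1.0 / (r_k + 1e-12)

def density_bootstrap_sample(X, y, k=5, rng=None):
    """Draw one bootstrap sample of size n with selection probabilities
    proportional to the k-NN density scores, in place of bagging's
    uniform resampling weights."""
    rng = np.random.default_rng(rng)
    w = knn_density_weights(X, k)
    p = w / w.sum()
    idx = rng.choice(len(X), size=len(X), replace=True, p=p)
    return X[idx], y[idx]

# Usage: each member of the ensemble would be trained on one such sample.
X = np.random.default_rng(0).normal(size=(100, 2))
y = (X[:, 0] > 0).astype(int)
Xb, yb = density_bootstrap_sample(X, y, k=5, rng=1)
```

Compared with plain bagging, non-uniform selection probabilities change how often each point is drawn, which is one way such a scheme can induce more variability across the generated training sets.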