Neural network ensembles (sometimes referred to as committees or classifier ensembles) are an effective technique for improving the generalization of a neural network system. Combining a set of neural network classifiers whose error distributions are diverse can produce more accurate results than any single network. Combination strategies commonly used in ensembles include simple averaging, weighted averaging, majority voting and ranking. However, each method has its limitations, whether in the range of applications it suits or in its effectiveness. This paper proposes a new ensemble combination scheme called multistage neural network ensembles. Experimental investigations based on multistage neural network ensembles are presented, and the benefit of using this approach as an additional combination method in ensembles is demonstrated.
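To make the standard combination strategies named above concrete, the following is a minimal sketch of simple averaging, weighted averaging, and majority voting over class-probability outputs. It uses NumPy; the three networks, their probability outputs, and the weights are illustrative placeholders, not values from the paper.

```python
import numpy as np

# Hypothetical softmax outputs from three networks for four samples
# over three classes (shape: n_networks x n_samples x n_classes).
probs = np.array([
    [[0.7, 0.2, 0.1], [0.1, 0.8, 0.1], [0.3, 0.4, 0.3], [0.6, 0.3, 0.1]],
    [[0.6, 0.3, 0.1], [0.2, 0.6, 0.2], [0.5, 0.3, 0.2], [0.4, 0.4, 0.2]],
    [[0.8, 0.1, 0.1], [0.3, 0.5, 0.2], [0.2, 0.5, 0.3], [0.5, 0.2, 0.3]],
])

# Simple averaging: mean class probability across networks, then argmax.
avg_pred = probs.mean(axis=0).argmax(axis=1)

# Weighted averaging: weights might reflect each network's validation
# accuracy (placeholder values here).
weights = np.array([0.5, 0.3, 0.2])
weighted_pred = np.tensordot(weights, probs, axes=1).argmax(axis=1)

# Majority voting: each network casts one vote for its top class;
# the most frequent vote wins.
votes = probs.argmax(axis=2)  # shape: n_networks x n_samples
maj_pred = np.array(
    [np.bincount(votes[:, i]).argmax() for i in range(votes.shape[1])]
)

print(avg_pred, weighted_pred, maj_pred)
```

A ranking-based combiner would instead order the candidate classes by each network's preference and merge the rankings; it is omitted here for brevity.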