Boosting by weighting critical and erroneous samples
Neurocomputing
Multi-net systems have become very popular during the last decade, and a great variety of techniques have been proposed, many of them achieving excellent performance in recognition tasks. In this paper, we show that focusing on the hardest patterns plays a crucial role in AdaBoost, one of the most widely used multi-net systems. To do so, we use a novel technique to illustrate how AdaBoost effectively concentrates its training on the regions near the decision border. We then propose a new method for training multi-net systems that shares this property with AdaBoost. When tested on three benchmark datasets, both schemes outperform single nets and an ensemble system in which the training sets are held constant and the component members differ only through randomness introduced during training. Their better performance supports the notion that an increasing focus on hard patterns is beneficial.
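To make concrete how AdaBoost "focuses on the hardest patterns," the following is a minimal sketch of the standard AdaBoost weight update (not the paper's proposed method); the function name and the use of decision stumps as weak learners are illustrative assumptions. Samples that the current weak learner misclassifies have their weights multiplied by a factor greater than one, so weight mass gradually concentrates on hard patterns near the decision border.

```python
# Illustrative sketch of AdaBoost's sample reweighting; all names are
# hypothetical and decision stumps stand in for the weak learner.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_weights(X, y, n_rounds=10):
    """Track how AdaBoost's sample weights evolve; labels y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)              # start with uniform emphasis
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.clip(w[pred != y].sum(), 1e-10, 1 - 1e-10)  # weighted error
        alpha = 0.5 * np.log((1 - err) / err)
        # Misclassified (hard) samples are scaled by exp(+alpha),
        # correctly classified ones by exp(-alpha): emphasis drifts
        # toward the patterns closest to the decision border.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()                      # renormalize to a distribution
    return w
```

Plotting the returned weights over the input space (for example, on a two-dimensional toy problem) is one way to visualize the concentration of emphasis near the border that the paper describes.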