Generating an architecture for an ensemble of boosting machines involves a series of design decisions. One decision is whether to use simple "weak learners", such as decision stumps, or more complex ones, such as large decision trees or neural networks. Another is the training algorithm for the constituent weak learners. Here we concentrate on binary decision trees and show that the best results are obtained by using the Z-criterion to build the trees without pruning. When neural networks are used as weak learners, early stopping is recommended to reduce training time. For multi-class boosting, it remains unclear whether the all-pairs binary reduction or the pseudo-loss criterion performs better.
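To make the weak-learner design decision concrete, the sketch below implements discrete AdaBoost with depth-one decision stumps as the weak learner, using only numpy. It is a minimal illustration of the standard boosting loop (weighted stump search, vote weight, example reweighting), not a reproduction of this paper's method: the Z-criterion tree construction, the neural-network variant, and the multi-class extensions discussed above are not implemented here, and all function names are our own.

    import numpy as np

    def fit_stump(X, y, w):
        """Search one-feature threshold stumps, minimizing the weighted error."""
        n, d = X.shape
        best = (np.inf, 0, 0.0, 1)  # (error, feature, threshold, polarity)
        for j in range(d):
            for t in np.unique(X[:, j]):
                for polarity in (1, -1):
                    pred = np.where(polarity * (X[:, j] - t) > 0, 1, -1)
                    err = np.sum(w[pred != y])
                    if err < best[0]:
                        best = (err, j, t, polarity)
        return best

    def predict_stump(X, feature, threshold, polarity):
        return np.where(polarity * (X[:, feature] - threshold) > 0, 1, -1)

    def adaboost(X, y, n_rounds=50):
        """Discrete AdaBoost; labels y must be in {-1, +1}."""
        n = len(y)
        w = np.full(n, 1.0 / n)                   # example weights, uniform at the start
        ensemble = []
        for _ in range(n_rounds):
            err, j, t, s = fit_stump(X, y, w)
            err = max(err, 1e-10)                 # guard against division by zero
            alpha = 0.5 * np.log((1 - err) / err) # this round's vote weight
            pred = predict_stump(X, j, t, s)
            w *= np.exp(-alpha * y * pred)        # up-weight misclassified examples
            w /= w.sum()
            ensemble.append((alpha, j, t, s))
        return ensemble

    def predict(ensemble, X):
        score = sum(a * predict_stump(X, j, t, s) for a, j, t, s in ensemble)
        return np.sign(score)

    # Toy usage: two Gaussian blobs with labels in {-1, +1}.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
    y = np.r_[np.full(50, -1), np.full(50, 1)]
    model = adaboost(X, y, n_rounds=30)
    print("train accuracy:", np.mean(predict(model, X) == y))

Swapping the stump for a deeper, unpruned tree (or a small neural network trained with early stopping) changes only the fit/predict pair; the boosting loop itself is unaffected, which is what makes the weak-learner choice an independent design decision.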