An introduction to computational learning theory
Boosting as entropy projection
COLT '99 Proceedings of the twelfth annual conference on Computational learning theory
Online ensemble learning
Rotation Forest: A New Classifier Ensemble Method
IEEE Transactions on Pattern Analysis and Machine Intelligence
Clustering data manipulation methods for the development of local specialists
AIAP'07 Proceedings of the 25th IASTED International Multi-Conference: Artificial Intelligence and Applications
A boosting approach to remove class label noise
International Journal of Hybrid Intelligent Systems - Hybrid Intelligent systems in Ensembles
Decision Fusion on Boosting Ensembles
ANNPR '08 Proceedings of the 3rd IAPR workshop on Artificial Neural Networks in Pattern Recognition
Improving adaptive boosting with a relaxed equation to update the sampling distribution
IWANN'07 Proceedings of the 9th international work conference on Artificial neural networks
Creating ensembles of classifiers via fuzzy clustering and deflection
Fuzzy Sets and Systems
Averaged conservative boosting: introducing a new method to build ensembles of neural networks
ICANN'07 Proceedings of the 17th international conference on Artificial neural networks
Adaboost with totally corrective updates for fast face detection
FGR '04 Proceedings of the Sixth IEEE International Conference on Automatic Face and Gesture Recognition
Ensembles of multilayer feedforward: a new comparison
NN'05 Proceedings of the 6th WSEAS international conference on Neural networks
New results on ensembles of multilayer feedforward
ICANN'05 Proceedings of the 15th international conference on Artificial neural networks: formal models and their applications - Volume Part II
Improving adaptive boosting with k-cross-fold validation
ICIC'06 Proceedings of the 2006 international conference on Intelligent Computing - Volume Part I
Ensembles of multilayer feedforward: some new results
IWANN'05 Proceedings of the 8th international conference on Artificial Neural Networks: Computational Intelligence and Bioinspired Systems
ICONIP'06 Proceedings of the 13th international conference on Neural Information Processing - Volume Part I
Improving boosting methods by generating specific training and validation sets
ICONIP'11 Proceedings of the 18th international conference on Neural Information Processing - Volume Part II
New AdaBoost algorithm based on interval-valued fuzzy sets
IDEAL'12 Proceedings of the 13th international conference on Intelligent Data Engineering and Automated Learning
AdaBoost [5] is a well-known ensemble learning algorithm that constructs its constituent, or base, models in sequence. A key step in AdaBoost is constructing a distribution over the training examples from which each base model is created. This distribution, represented as a vector, is constructed to be orthogonal to the mistake vector of the previous base model in the sequence [7], the idea being to make the next base model's errors uncorrelated with those of its predecessor. Several researchers have noted that it would probably be better to construct a distribution orthogonal to the mistake vectors of all previous base models, but that this is not always possible [7]. We present an algorithm that efficiently comes as close as possible to this goal, along with experimental results demonstrating significant improvement over AdaBoost and over the Totally Corrective boosting algorithm [7], which pursues the same objective.
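The orthogonality property described above follows from the standard AdaBoost reweighting step: after the update, the previous base model's weighted error is exactly 1/2, so the new distribution has zero dot product with that model's mistake vector. A minimal NumPy sketch (the function name and the toy mistake vector are illustrative, not from the paper):

```python
import numpy as np

def adaboost_reweight(D, margins):
    """One AdaBoost distribution update.

    D       : current distribution over training examples (sums to 1)
    margins : vector u with u[i] = y_i * h(x_i) in {+1, -1}
              (+1 if the base model classified example i correctly)
    """
    eps = D[margins == -1].sum()            # weighted error of the base model
    alpha = 0.5 * np.log((1 - eps) / eps)   # standard AdaBoost step size
    D_new = D * np.exp(-alpha * margins)    # upweight mistakes, downweight hits
    return D_new / D_new.sum()              # renormalize to a distribution

# Toy check of the orthogonality property on a hypothetical mistake vector:
D = np.full(8, 1 / 8)
u = np.array([1, 1, -1, 1, -1, 1, 1, 1])
D_next = adaboost_reweight(D, u)
print(np.dot(D_next, u))  # ~0: the new distribution is orthogonal to u
```

Note that the update only decorrelates against the most recent mistake vector; earlier base models generally regain nonzero correlation under `D_next`, which is exactly the gap the totally corrective approach tries to close.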