Improved Boosting Algorithms Using Confidence-rated Predictions
Machine Learning; The Eleventh Annual Conference on Computational Learning Theory
Experimental comparisons of online and batch versions of bagging and boosting
Proceedings of the Seventh ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
A streaming ensemble algorithm (SEA) for large-scale classification
Proceedings of the Seventh ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
Dynamic Weighted Majority: A New Ensemble Method for Tracking Concept Drift
Proceedings of the Third IEEE International Conference on Data Mining (ICDM '03)
Mining concept-drifting data streams using ensemble classifiers
Proceedings of the Ninth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
New ensemble methods for evolving data streams
Proceedings of the 15th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
Knowledge-Based Sampling for Subgroup Discovery
Proceedings of the 2004 International Conference on Local Pattern Detection (LPD '04)
Online non-stationary boosting
Proceedings of the 9th International Conference on Multiple Classifier Systems (MCS '10)
A novel online boosting algorithm for automatic anatomy detection
Machine Vision and Applications
Methods involving ensembles of classifiers, such as bagging and boosting, are popular due to their strong theoretical performance guarantees and superior empirical results. Ensemble methods are typically designed under the assumption that the training data set is static and completely available at training time. As such, they are not suitable for online and incremental learning. In this paper we propose IBoost, an extension of AdaBoost for incremental learning via optimization of an exponential cost function that changes over time as the training data changes. The resulting algorithm is flexible and allows a user to customize it to the computational constraints of the particular application. The new algorithm was evaluated on stream learning in the presence of concept change. Experimental results show that IBoost achieves better performance than the original AdaBoost retrained from scratch each time the data set changes, and that it also outperforms the previously proposed Online Coordinate Boost, Online Boost and its non-stationary modifications, Fast and Light Boosting, ADWIN Online Bagging, and DWM algorithms.
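IBoost's incremental update rule is not reproduced in this abstract, so it is not sketched here. As context for the exponential cost function the paper builds on, the following is a minimal, self-contained sketch of the classical batch AdaBoost loop that IBoost extends: each round picks the weighted-error-minimizing weak hypothesis and reweights examples by the exponential loss. The toy 1-D dataset, stump thresholds, and round count are illustrative choices, not from the paper.

```python
import math

# Illustrative toy 1-D dataset: labels in {-1, +1}.
X = [0.1, 0.2, 0.3, 0.6, 0.7, 0.9]
y = [+1, +1, +1, -1, -1, -1]

def stump(threshold, polarity):
    """Decision stump: predicts `polarity` if x < threshold, else -polarity."""
    return lambda x: polarity if x < threshold else -polarity

def adaboost(X, y, rounds=5):
    n = len(X)
    w = [1.0 / n] * n          # example weights, initially uniform
    ensemble = []              # list of (alpha, hypothesis) pairs
    # Small illustrative pool of candidate weak hypotheses.
    candidates = [stump(t, p)
                  for t in (0.15, 0.25, 0.45, 0.65, 0.8) for p in (+1, -1)]
    for _ in range(rounds):
        # Pick the hypothesis with minimum weighted error on current weights.
        h, err = min(((h, sum(wi for wi, xi, yi in zip(w, X, y) if h(xi) != yi))
                      for h in candidates),
                     key=lambda pair: pair[1])
        err = max(err, 1e-10)  # guard against log(0) when err == 0
        if err >= 0.5:
            break              # no weak hypothesis better than chance
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, h))
        # Exponential reweighting: greedily decreases the exponential cost
        # sum_i exp(-y_i * F(x_i)) of the combined classifier F.
        w = [wi * math.exp(-alpha * yi * h(xi))
             for wi, xi, yi in zip(w, X, y)]
        z = sum(w)
        w = [wi / z for wi in w]  # renormalize to a distribution
    return ensemble

def predict(ensemble, x):
    """Weighted-majority vote of the ensemble."""
    return 1 if sum(a * h(x) for a, h in ensemble) >= 0 else -1
```

The batch version above re-fits on a fixed data set; the point of IBoost is to update the ensemble and this cost function incrementally as the data set changes, instead of rerunning the whole loop.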