Most stream classifiers are designed to process data incrementally, run in resource-aware environments, and react to concept drifts, i.e., unforeseen changes in the stream's underlying data distribution. Ensemble classifiers have become an established research line in this field, mainly because their modularity offers a natural way of adapting to change. However, in environments where class labels are available after each example, ensembles that process instances in blocks do not react to sudden changes quickly enough. Conversely, ensembles that process streams incrementally do not take advantage of the periodic adaptation mechanisms known from block-based ensembles, which react accurately to gradual and incremental changes. In this paper, we analyze whether and how the characteristics of block-based and incremental processing can be combined to produce new types of ensemble classifiers. We consider and experimentally evaluate three general strategies for transforming a block-based ensemble into an incremental learner: online component evaluation, the introduction of an incremental learner, and the use of a drift detector. Based on the results of this analysis, we put forward a new incremental ensemble classifier, called Online Accuracy Updated Ensemble, which weights component classifiers according to their error in constant time and memory. The proposed algorithm was experimentally compared with four state-of-the-art online ensembles and achieved the best average classification accuracy on real and synthetic datasets simulating different drift scenarios.
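The abstract does not give the exact weighting formula, so the following is only a minimal sketch of the general idea it describes: each component classifier maintains a constant-memory running estimate of its prediction error (updated test-then-train on every example), and its vote weight decreases as that error grows. The class names, the fading-factor update, and the inverse-error weight are hypothetical illustrations, not the paper's actual Online Accuracy Updated Ensemble scheme.

```python
class WeightedComponent:
    """Wraps a base classifier with a constant-memory error estimate.

    Hypothetical sketch: a fading-factor (exponential moving average)
    error estimate stands in for whatever incremental error measure
    the actual algorithm uses.
    """

    def __init__(self, classifier, decay=0.99):
        self.classifier = classifier
        self.decay = decay   # fading factor: how fast old errors are forgotten
        self.error = 0.5     # running error estimate in [0, 1]

    def update(self, x, y):
        # Test-then-train: score the component on the new example first,
        # then it would train on (x, y) (training step omitted here).
        mistake = 1.0 if self.classifier.predict(x) != y else 0.0
        self.error = self.decay * self.error + (1.0 - self.decay) * mistake

    @property
    def weight(self):
        # Lower error -> higher vote weight; epsilon avoids division by zero.
        return 1.0 / (self.error + 1e-6)


def predict_ensemble(components, x, classes):
    """Weighted-vote prediction over all component classifiers."""
    votes = {c: 0.0 for c in classes}
    for comp in components:
        votes[comp.classifier.predict(x)] += comp.weight
    return max(votes, key=votes.get)
```

Because both the error update and the weight lookup touch only a single scalar per component, processing each labeled example takes constant time and memory per component, which is the property the abstract emphasizes.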