In this paper, we study the problem of constructing accurate block-based ensemble classifiers from time-evolving data streams. The Accuracy Weighted Ensemble (AWE) is the best-known representative of such ensembles. We propose a new algorithm, called the Accuracy Updated Ensemble (AUE), which extends AWE by using online component classifiers and updating them according to the current distribution. Additional modifications of the weighting function resolve the problem of undesired exclusion of component classifiers observed in AWE. Experiments on several evolving data sets show that, while still requiring constant processing time and memory, AUE is more accurate than AWE.
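The block-based mechanics described above can be illustrated with a small sketch. This is not the paper's algorithm: the component learners here are toy majority-class predictors (features are omitted entirely), and members are weighted by plain accuracy on the newest block rather than AUE's MSE-based weighting function. The class names and all parameters are hypothetical; the sketch only shows the three moves the abstract contrasts with AWE: re-weight members on the current block, update existing members online, and add a new component while dropping the weakest.

```python
from collections import Counter

class MajorityClassifier:
    """Toy component: predicts the most frequent label it has seen.
    A stand-in for the incremental base learners used in block ensembles."""
    def __init__(self):
        self.counts = Counter()

    def update(self, labels):
        # Online update with the labels of a new block.
        self.counts.update(labels)

    def predict(self):
        return self.counts.most_common(1)[0][0]

class AUESketch:
    """Illustrative AUE-style block ensemble (hypothetical simplification)."""
    def __init__(self, max_members=3):
        self.max_members = max_members
        self.members = []  # list of [classifier, weight]

    def process_block(self, labels):
        # 1) Re-weight every member by its accuracy on the newest block
        #    (the real AUE uses an MSE-based weighting function instead).
        for m in self.members:
            pred = m[0].predict()
            m[1] = sum(1 for y in labels if y == pred) / len(labels)
        # 2) Update existing members online with the block
        #    (unlike AWE, which never retrains past components).
        for m in self.members:
            m[0].update(labels)
        # 3) Train a candidate on the new block; once the ensemble is
        #    full, it replaces the lowest-weighted member.
        cand = MajorityClassifier()
        cand.update(labels)
        self.members.append([cand, 1.0])
        if len(self.members) > self.max_members:
            self.members.remove(min(self.members, key=lambda m: m[1]))

    def predict(self):
        # Weighted vote across all members.
        votes = Counter()
        for clf, w in self.members:
            votes[clf.predict()] += w
        return votes.most_common(1)[0][0]
```

Because members are both re-weighted and updated on each block, a component trained before a concept drift can recover instead of being permanently outvoted, which is the behavior that distinguishes this scheme from a pure replace-only ensemble.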