We introduce an ensemble-of-classifiers approach for incremental learning of concept drift in nonstationary environments (NSEs), where the underlying data distributions change over time. The proposed algorithm, named Learn++.NSE, learns from consecutive batches of data without making any assumptions about the nature or rate of drift; it can learn in environments that experience a constant or variable rate of drift, addition or deletion of concept classes, as well as cyclical drift. Like other members of the Learn++ family of algorithms, it learns incrementally, that is, without requiring access to previously seen data. Learn++.NSE trains one new classifier for each batch of data it receives and combines these classifiers using dynamically weighted majority voting. The novelty of the approach lies in how the voting weights are determined: each classifier is weighted by its time-adjusted accuracy on current and past environments. This allows the algorithm to recognize, and act on, changes in the underlying data distributions, as well as the possible recurrence of an earlier distribution. We evaluate the algorithm on several synthetic datasets designed to simulate a variety of nonstationary environments, as well as on a real-world weather prediction dataset, and include comparisons with several other approaches. The results indicate that Learn++.NSE can track the changing environments very closely, regardless of the type of concept drift. To enable future use, comparison, and benchmarking by interested researchers, we also release the data used in this paper.
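The time-adjusted weighting described above can be illustrated with a short sketch. This is not the authors' implementation: the sigmoid recency weighting with parameters `a` and `b`, the error clipping, and the function name `nse_voting_weights` are all illustrative assumptions; the sketch only shows the general idea of converting each classifier's recency-weighted error history into a log-odds voting weight.

```python
import numpy as np

def nse_voting_weights(error_history, a=0.5, b=2.0):
    """Sketch of Learn++.NSE-style voting weights (illustrative, not the
    paper's exact formulation).

    error_history: one list per ensemble member, holding that classifier's
    error rate on each batch it has seen so far, oldest first. A sigmoid
    gives recent batches more influence; a (slope) and b (cutoff age) are
    assumed values, not parameters taken from the paper's experiments.
    """
    weights = []
    for errs in error_history:
        errs = np.clip(np.asarray(errs, dtype=float), 1e-10, 1.0 - 1e-10)
        n = len(errs)
        age = (n - 1) - np.arange(n)          # newest batch has age 0
        # Recency sigmoid: near 1 for recent batches, decays past age b.
        omega = 1.0 / (1.0 + np.exp(a * (age - b)))
        omega /= omega.sum()                  # normalize to a convex average
        beta = errs / (1.0 - errs)            # error rate -> error odds
        beta_bar = float(np.dot(omega, beta)) # time-weighted average error
        weights.append(np.log(1.0 / beta_bar))
    return np.array(weights)
```

A classifier whose recent errors are low receives a large positive weight, while one that has drifted out of relevance is effectively silenced; the final prediction would then be the class with the largest weighted vote across the ensemble.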