Learning time-varying concepts
NIPS-3 Proceedings of the 1990 conference on Advances in neural information processing systems 3
Detection of abrupt changes: theory and application
An introduction to computational learning theory
Learning in the presence of concept drift and hidden contexts
Machine Learning - Special issue on context sensitivity and concept drift
Mining high-speed data streams
Proceedings of the sixth ACM SIGKDD international conference on Knowledge discovery and data mining
Mining time-changing data streams
Proceedings of the seventh ACM SIGKDD international conference on Knowledge discovery and data mining
A streaming ensemble algorithm (SEA) for large-scale classification
Proceedings of the seventh ACM SIGKDD international conference on Knowledge discovery and data mining
Neural Networks for Pattern Recognition
Machine Learning
Maintaining Stream Statistics over Sliding Windows
SIAM Journal on Computing
Maintaining variance and k-medians over data stream windows
Proceedings of the twenty-second ACM SIGMOD-SIGACT-SIGART symposium on Principles of database systems
Accurate decision trees for mining high-speed data streams
Proceedings of the ninth ACM SIGKDD international conference on Knowledge discovery and data mining
Discovering decision rules from numerical data streams
Proceedings of the 2004 ACM symposium on Applied computing
YALE: rapid prototyping for complex data mining tasks
Proceedings of the 12th ACM SIGKDD international conference on Knowledge discovery and data mining
Catch the moment: maintaining closed frequent itemsets over a data stream sliding window
Knowledge and Information Systems
Statistical Comparisons of Classifiers over Multiple Data Sets
The Journal of Machine Learning Research
Learning drifting concepts: Example selection vs. example weighting
Intelligent Data Analysis
Detecting change in data streams
VLDB '04 Proceedings of the Thirtieth international conference on Very large data bases - Volume 30
Dynamic Weighted Majority: An Ensemble Method for Drifting Concepts
The Journal of Machine Learning Research
Hierarchical Clustering of Time-Series Data Streams
IEEE Transactions on Knowledge and Data Engineering
Paired Learners for Concept Drift
ICDM '08 Proceedings of the 2008 Eighth IEEE International Conference on Data Mining
Issues in evaluation of stream learning algorithms
Proceedings of the 15th ACM SIGKDD international conference on Knowledge discovery and data mining
Tracking recurring contexts using ensemble classifiers: an application to email filtering
Knowledge and Information Systems
The Journal of Machine Learning Research
Evaluating Learning Algorithms: A Classification Perspective
Fast perceptron decision tree learning from evolving data streams
PAKDD'10 Proceedings of the 14th Pacific-Asia conference on Advances in Knowledge Discovery and Data Mining - Volume Part II
Learning decision rules from data streams
IJCAI'11 Proceedings of the Twenty-Second international joint conference on Artificial Intelligence - Volume Two
A survey on concept drift adaptation
ACM Computing Surveys (CSUR)
Combining block-based and online methods in learning ensembles from concept drifting data streams
Information Sciences: an International Journal
Most streaming decision models evolve continuously over time, run in resource-aware environments, and detect and react to changes in the environment generating the data. One important issue, not yet convincingly addressed, is the design of experimental work to evaluate and compare decision models that evolve over time. This paper proposes a general framework for assessing predictive stream learning algorithms. We defend the use of prequential error with forgetting mechanisms to provide reliable error estimators. We prove that, on stationary data and for consistent learning algorithms, the holdout estimator, the prequential error, and the prequential error estimated over a sliding window or using fading factors all converge to the Bayes error. The use of prequential error with forgetting mechanisms proves advantageous for assessing performance and comparing stream learning algorithms. The proposed methods are also useful for hypothesis testing and for change detection. In a set of experiments in drift scenarios, we evaluate the ability of a standard change detection algorithm to detect change using three prequential error estimators. These experiments indicate that forgetting mechanisms (sliding windows or fading factors) are required for fast and efficient change detection. Compared with sliding windows, fading factors are faster and memoryless, both important properties for streaming applications. Overall, this paper is a contribution to a discussion on best practice for performance assessment when learning is a continuous process and the decision models are dynamic and evolve over time.
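The three prequential error estimators the abstract compares (cumulative, sliding-window, and fading-factor) can be sketched in a few lines. The following is an illustrative Python sketch, not the authors' implementation: `prequential_errors`, `predict_then_learn`, and the parameter names are hypothetical, and the fading-factor recursion S_i = l_i + a*S_{i-1}, B_i = 1 + a*B_{i-1} follows the standard recursive definition of a fading-factor average.

```python
from collections import deque

def prequential_errors(stream, predict_then_learn, alpha=0.995, window=1000):
    """For each example, yield three prequential error estimates:
    cumulative, sliding-window, and fading-factor.

    predict_then_learn(x, y) is a hypothetical test-then-train hook:
    it must predict on x, return the incurred loss, then learn from (x, y).
    """
    win = deque(maxlen=window)   # losses inside the sliding window
    total = 0.0                  # cumulative loss since the start
    s = 0.0                      # fading-factor loss accumulator S_i
    b = 0.0                      # fading-factor normaliser B_i
    for i, (x, y) in enumerate(stream, start=1):
        loss = predict_then_learn(x, y)
        total += loss
        win.append(loss)
        s = loss + alpha * s     # S_i = l_i + alpha * S_{i-1}
        b = 1.0 + alpha * b      # B_i = 1  + alpha * B_{i-1}
        yield total / i, sum(win) / len(win), s / b
```

Note the design point the abstract makes: the fading-factor estimator keeps only two scalars (`s`, `b`) regardless of the stream length, whereas the sliding-window estimator must buffer the last `window` losses; both discount old examples, which is what makes them responsive after a concept change while the cumulative estimate reacts increasingly slowly.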