On the complexity of learning from drifting distributions
COLT '96 Proceedings of the ninth annual conference on Computational learning theory
The complexity of learning according to two models of a drifting environment
COLT '98 Proceedings of the eleventh annual conference on Computational learning theory
On theory revision with queries
COLT '99 Proceedings of the twelfth annual conference on Computational learning theory
Predictive learning models for concept drift
Theoretical Computer Science - Algorithmic learning theory
On-line learning with malicious noise and the closure algorithm
Annals of Mathematics and Artificial Intelligence
Discovering Robust Knowledge from Databases that Change
Data Mining and Knowledge Discovery
Learning Changing Concepts by Exploiting the Structure of Change
Machine Learning
Theory Revision with Queries: DNF Formulas
Machine Learning
Refined Time Stamps for Concept Drift Detection During Mining for Classification Rules
TSDM '00 Proceedings of the First International Workshop on Temporal, Spatial, and Spatio-Temporal Data Mining - Revised Papers
Classification of Customer Call Data in the Presence of Concept Drift and Noise
Soft-Ware 2002 Proceedings of the First International Conference on Computing in an Imperfect World
Mining Changes for Real-Life Applications
DaWaK 2000 Proceedings of the Second International Conference on Data Warehousing and Knowledge Discovery
On the difficulty of approximately maximizing agreements
Journal of Computer and System Sciences
Distinctive Features of Minimization of a Risk Functional in Mass Data Sets
Cybernetics and Systems Analysis
Relevant Data Expansion for Learning Concept Drift from Sparsely Labeled Data
IEEE Transactions on Knowledge and Data Engineering
Using additive expert ensembles to cope with concept drift
ICML '05 Proceedings of the 22nd international conference on Machine learning
A Machine Learning Evaluation of an Artificial Immune System
Evolutionary Computation
ICML '06 Proceedings of the 23rd international conference on Machine learning
Applying lazy learning algorithms to tackle concept drift in spam filtering
Expert Systems with Applications: An International Journal
Neighborhood Property-Based Pattern Selection for Support Vector Machines
Neural Computation
Incremental learning and concept drift in INTHELEX
Intelligent Data Analysis
Learning drifting concepts: Example selection vs. example weighting
Intelligent Data Analysis
Using multiple windows to track concept drift
Intelligent Data Analysis
Online classification of nonstationary data streams
Intelligent Data Analysis
Boosting classifiers for drifting concepts
Intelligent Data Analysis - Knowledge Discovery from Data Streams
An active learning system for mining time-changing data streams
Intelligent Data Analysis
Efficient instance-based learning on data streams
Intelligent Data Analysis
Real-time data mining of non-stationary data streams from sensor networks
Information Fusion
Local likelihood modeling of temporal text streams
Proceedings of the 25th international conference on Machine learning
Dynamic Weighted Majority: An Ensemble Method for Drifting Concepts
The Journal of Machine Learning Research
Learning Drifting Negotiations
Applied Artificial Intelligence
Info-fuzzy algorithms for mining dynamic data streams
Applied Soft Computing
Adaptive Machine Learning in Delayed Feedback Domains by Selective Relearning
Applied Artificial Intelligence
Incremental learning in nonstationary environments with controlled forgetting
IJCNN'09 Proceedings of the 2009 international joint conference on Neural Networks
Adaptive Stream Mining: Pattern Learning and Mining from Evolving Data Streams
Proceedings of the 2010 conference on Adaptive Stream Mining: Pattern Learning and Mining from Evolving Data Streams
An ensemble approach for incremental learning in nonstationary environments
MCS'07 Proceedings of the 7th international conference on Multiple classifier systems
Adaptive classifiers with ICI-based adaptive knowledge base management
ICANN'10 Proceedings of the 20th international conference on Artificial neural networks: Part II
On-line learning: where are we so far?
Ubiquitous knowledge discovery
Mining Recurring Concept Drifts with Limited Labeled Streaming Data
ACM Transactions on Intelligent Systems and Technology (TIST)
Probabilistic user modeling in the presence of drifting concepts
PAKDD'10 Proceedings of the 14th Pacific-Asia conference on Advances in Knowledge Discovery and Data Mining - Volume Part I
Tracking concept drift in malware families
Proceedings of the 5th ACM workshop on Security and artificial intelligence
New analysis and algorithm for learning with drifting distributions
ALT'12 Proceedings of the 23rd international conference on Algorithmic Learning Theory
A social approach for learning agents
Expert Systems with Applications: An International Journal
A survey on concept drift adaptation
ACM Computing Surveys (CSUR)
In this paper we consider the problem of tracking a subset of a domain (called the target) which changes gradually over time. A single (unknown) probability distribution over the domain is used both to generate random examples for the learning algorithm and to measure the speed at which the target changes. Clearly, the more rapidly the target moves, the harder it is for the algorithm to maintain a good approximation of the target. Therefore we evaluate algorithms based on how much movement of the target can be tolerated between examples while still predicting with accuracy ε. Furthermore, the complexity of the class H of possible targets, as measured by its Vapnik-Chervonenkis (VC) dimension d, also affects the difficulty of tracking the target concept. We show that if the problem of minimizing the number of disagreements with a sample from among concepts in a class H can be approximated to within a factor k, then there is a simple tracking algorithm for H which can achieve a probability ε of making a mistake if the target movement rate is at most a constant times ε²/(k(d + k) ln(1/ε)). Also, we show that if H is properly PAC-learnable, then there is an efficient (randomized) algorithm that with high probability approximately minimizes disagreements to within a factor of 7d + 1, yielding an efficient tracking algorithm for H which tolerates drift rates up to a constant times ε²/(d² ln(1/ε)). In addition, we prove complementary results for the classes of halfspaces and axis-aligned hyperrectangles showing that the maximum rate of drift that any algorithm (even with unlimited computational power) can tolerate is a constant times ε²/d.
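The tracking strategy analyzed above — predict with a concept that (approximately) minimizes disagreements on a window of recent examples — can be illustrated with a minimal sketch. The example below is not the paper's algorithm with its formal guarantees; it is a hedged toy instance using the one-dimensional threshold class (VC dimension 1), a slowly drifting target threshold, and exact disagreement minimization over the window. All parameter values (window size, drift rate) are illustrative choices, not values from the paper.

```python
import random

def disagreements(theta, window):
    # Count window examples mislabeled by the hypothesis "x <= theta".
    return sum((x <= theta) != y for x, y in window)

def best_threshold(window):
    # Proper disagreement minimization over the threshold class:
    # for one-dimensional thresholds it suffices to consider the
    # observed points (plus 0.0) as candidate thresholds.
    candidates = [0.0] + [x for x, _ in window]
    return min(candidates, key=lambda t: disagreements(t, window))

def track(drift_rate=0.001, window_size=100, steps=1500, seed=0):
    """Track a drifting threshold; return the late-phase mistake rate."""
    rng = random.Random(seed)
    target = 0.5          # true threshold, drifts a little each step
    window = []           # sliding window of recent labeled examples
    mistakes_late = 0
    for t in range(steps):
        x = rng.random()                   # example from a fixed distribution
        y = (x <= target)                  # true label under current target
        if window:
            pred = (x <= best_threshold(window))
            if t >= steps // 2 and pred != y:
                mistakes_late += 1
        window.append((x, y))
        if len(window) > window_size:
            window.pop(0)
        # bounded random drift of the target between examples
        target = min(1.0, max(0.0, target + rng.uniform(-drift_rate, drift_rate)))
    return mistakes_late / (steps - steps // 2)
```

With a window short enough that the target barely moves across it, the empirical disagreement minimizer stays close to the current target, so the mistake rate remains small — mirroring the trade-off between drift rate and achievable accuracy ε discussed above.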