In concept learning, different portions of a target concept may change at different rates: some undergo short-term changes, others change over the long term, and still others do not change at all. Consequently, several local windows must be maintained rather than a single global one. We address this problem, which arises naturally in concept learning, by allocating windows that adapt their size to individual portions of the target concept. We propose an incremental decision tree that is updated with incoming examples. Each leaf of the tree holds a time window and a local performance measure, which is the main parameter being controlled: when the performance of a leaf decreases, the size of its local window is reduced. This learning algorithm, called OnlineTree2, automatically adjusts its internal parameters to match the current dynamics of the data stream. Results show that it is comparable to batch algorithms on problems with no concept change, and that it handles concept drift better than the evaluated methods on problems where concept change occurs at different speeds, noise may be present, and examples may arrive from different areas of the problem domain (virtual drift).
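The per-leaf mechanism can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact algorithm: the class name, the majority-class predictor, the accuracy proxy for the local performance measure, and the halving shrink rule are all assumptions made for clarity.

```python
from collections import deque

class AdaptiveLeaf:
    """Hypothetical sketch of one OnlineTree2-style leaf: a local time
    window whose size is reduced when local performance drops. The
    shrink rule and performance measure here are illustrative
    assumptions, not the paper's exact update."""

    def __init__(self, max_window=200, min_window=10, shrink_factor=0.5):
        self.max_window = max_window
        self.min_window = min_window
        self.shrink_factor = shrink_factor
        self.window = deque(maxlen=max_window)  # stored (x, y) examples
        self.prev_accuracy = None

    def predict(self):
        # Majority class over the current local window.
        labels = [y for _, y in self.window]
        return max(set(labels), key=labels.count) if labels else None

    def update(self, x, y):
        # Test-then-train: evaluate before absorbing the new example.
        correct = self.predict() == y
        self.window.append((x, y))
        acc = self._accuracy()
        if self.prev_accuracy is not None and acc < self.prev_accuracy:
            # Local performance dropped: suspect drift and shrink the
            # window so old examples are forgotten faster.
            new_size = max(self.min_window,
                           int(len(self.window) * self.shrink_factor))
            self.window = deque(list(self.window)[-new_size:],
                                maxlen=self.max_window)
        self.prev_accuracy = acc
        return correct

    def _accuracy(self):
        # Fraction of the window covered by the majority class — a
        # simple stand-in for the paper's local performance measure.
        labels = [y for _, y in self.window]
        if not labels:
            return 0.0
        majority = max(set(labels), key=labels.count)
        return labels.count(majority) / len(labels)
```

Because each leaf keeps its own window, a portion of the concept that drifts quickly shrinks its window aggressively while a stable portion keeps a long one, which is the key difference from a single global window.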