Computer Science Review
Non-stationary classification problems concern changes in the data distribution over a classifier's lifetime. To face this problem, learning algorithms must reconcile attributes that are essential but hard to achieve together: good classification performance, stability, and low associated costs such as processing time and memory. This paper presents an extension of the K-associated optimal graph learning algorithm that copes with classification over non-stationary domains. The algorithm relies on a graph structure consisting of many disconnected components (subgraphs). Such a graph enhances data representation by locally fitting groups of data according to a purity measure, which quantifies the overlap between vertices of different classes. As a result, the graph can be used to accurately estimate the probability that an unlabeled instance belongs to a given class. The proposed algorithm benefits from the dynamic evolution of the graph, updating its set of components as new data arrive over time and removing old components as new ones arise. Experimental results on artificial and real domains, together with further statistical analysis, show that the proposed algorithm is an effective solution to non-stationary classification problems.
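The purity idea can be illustrated with a minimal sketch. Assuming a simplified setting where each class is treated as a single group, purity is measured here as the average fraction of same-class points among each member's k nearest neighbors, and classification weights neighbor votes by that purity. The helper names (`k_nearest`, `class_purity`, `classify`) are hypothetical; the actual K-associated optimal graph algorithm builds per-component K-associated graphs with optimal K per component and differs in detail:

```python
# Simplified, hypothetical sketch of a purity-weighted nearest-neighbor vote;
# NOT the full K-associated optimal graph algorithm from the paper.
import math
from collections import Counter

def k_nearest(points, x, k, exclude=None):
    """Indices of the k points nearest to x (optionally excluding one index)."""
    order = sorted(
        (i for i in range(len(points)) if i != exclude),
        key=lambda i: math.dist(points[i], x),
    )
    return order[:k]

def class_purity(points, labels, cls, k):
    """Average fraction of same-class points among each member's k neighbors.

    A value near 1.0 means the class region overlaps little with other classes.
    """
    members = [i for i, y in enumerate(labels) if y == cls]
    fracs = []
    for i in members:
        nbrs = k_nearest(points, points[i], k, exclude=i)
        fracs.append(sum(labels[j] == cls for j in nbrs) / k)
    return sum(fracs) / len(fracs)

def classify(points, labels, x, k):
    """Purity-weighted k-NN vote: neighbors from purer class groups count more."""
    purities = {c: class_purity(points, labels, c, k) for c in set(labels)}
    votes = Counter()
    for i in k_nearest(points, x, k):
        votes[labels[i]] += purities[labels[i]]
    return votes.most_common(1)[0][0]
```

In the paper's non-stationary setting, such purity-fitted components would additionally be created as new data arrive and discarded as they become outdated; this static sketch only shows how purity can turn local class overlap into a confidence weight.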