Adaptive ensemble based learning in non-stationary environments with variable concept drift
ICONIP'10: Proceedings of the 17th International Conference on Neural Information Processing: Theory and Algorithms, Volume Part I
We have recently introduced an incremental learning algorithm, Learn++.NSE, for Non-Stationary Environments, in which the data distribution changes over time due to concept drift. Learn++.NSE is an ensemble-of-classifiers approach: it trains a new classifier on each consecutive batch of data as it becomes available, and combines all classifiers through an age-adjusted, dynamic, error-based weighted majority vote. Prior work has shown the algorithm's ability to track gradually changing environments, as well as its ability to retain former knowledge in cases of cyclical or recurring data, by retaining and appropriately weighting all classifiers generated thus far. In this contribution, we extend the analysis of the algorithm to more challenging environments with varying drift rates; more importantly, we present preliminary results on the algorithm's ability to accommodate the addition or removal of classes over time. We also present comparative results for a variation of the algorithm that employs error-based pruning in cyclical environments.
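To make the voting scheme concrete, the following is a minimal illustrative sketch of the kind of age-adjusted, error-based weighting the abstract describes. It is not the authors' implementation: the function names, the sigmoid parameters `a` and `b`, and the error-clipping choices are assumptions made for illustration. The idea is that each classifier's errors on past batches are averaged with a sigmoid that emphasizes recent batches, and low time-averaged error yields a high voting weight.

```python
import numpy as np

def nse_style_weights(error_histories, a=0.5, b=10):
    """Illustrative age-adjusted weights from per-batch error histories.

    error_histories[k] is the sequence of errors classifier k made on each
    batch since its creation (most recent last). a, b are sigmoid slope and
    offset parameters (illustrative defaults, not the paper's values).
    """
    weights = []
    for hist in error_histories:
        # clip errors away from 0 and cap at chance level (0.5)
        hist = np.clip(np.asarray(hist, dtype=float), 1e-6, 0.5)
        age = np.arange(len(hist))[::-1]              # 0 = most recent batch
        omega = 1.0 / (1.0 + np.exp(a * (age - b)))   # recent batches count more
        omega /= omega.sum()                          # normalize sigmoid weights
        # time-averaged normalized error beta = sum_t omega_t * e_t/(1-e_t)
        beta = float((omega * (hist / (1.0 - hist))).sum())
        weights.append(np.log(1.0 / beta))            # low error -> high weight
    return np.array(weights)

def weighted_majority_vote(predictions, weights, n_classes):
    """Combine hard class-label predictions by weighted majority vote."""
    scores = np.zeros(n_classes)
    for label, w in zip(predictions, weights):
        scores[label] += w
    return int(np.argmax(scores))
```

For example, a classifier with a constant batch error of 0.1 receives weight log(0.9/0.1) = log 9, which dominates a classifier whose error sits at 0.4; pruning variants like the one mentioned above would instead discard the lowest-weight classifiers rather than keep them all.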