References:
Incremental Induction of Decision Trees. Machine Learning.
Incremental learning with partial instance memory. Artificial Intelligence.
Neural Computation.
A note on the utility of incremental learning. AI Communications.
Statistical Comparisons of Classifiers over Multiple Data Sets. The Journal of Machine Learning Research.
Negative correlation in incremental learning. Natural Computing: an international journal.
Learn++: an incremental learning algorithm for supervised neural networks. IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews.
Evolving fuzzy neural networks for supervised/unsupervised online knowledge-based learning. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics.
Classifier ensembles are a major direction of incremental learning research, and many ensemble-based incremental learning methods have been proposed. Among them, Learn++, which is derived from the well-known ensemble algorithm AdaBoost, is special: it can work with any type of base classifier, whether or not the classifier is specially designed for incremental learning, so Learn++ potentially supports heterogeneous base classifiers. Based on extensive experiments, we analyze the advantages and disadvantages of Learn++. We then present a new ensemble incremental learning method, Bagging++, which is based on another well-known ensemble method, Bagging. The experimental results show that Bagging ensembles are a promising approach to incremental learning, and that heterogeneous Bagging++ achieves better generalization and faster learning than the compared methods, such as Learn++ and NCL.
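The abstract gives no pseudocode, so the following is only a hedged sketch of the general idea behind Bagging-based ensemble incremental learning: for each incoming batch of data, train a small bagged ensemble of base classifiers on bootstrap resamples of that batch, add them to a growing pool, and classify by majority vote over the whole pool. The class names `BaggingIncremental` and `NearestCentroid` are illustrative stand-ins, not names from the paper, and the base learner could be any classifier (the paper's point is that heterogeneous bases are possible).

```python
import random
from collections import Counter

class NearestCentroid:
    """Toy base learner for 2-D points; a stand-in for any base classifier."""
    def fit(self, X, y):
        sums = {}
        for (px, py), label in zip(X, y):
            sx, sy, n = sums.get(label, (0.0, 0.0, 0))
            sums[label] = (sx + px, sy + py, n + 1)
        # mean of each class's training points
        self.centroids = {l: (sx / n, sy / n) for l, (sx, sy, n) in sums.items()}
        return self

    def predict(self, x):
        # label of the nearest class centroid (squared Euclidean distance)
        return min(self.centroids,
                   key=lambda l: (x[0] - self.centroids[l][0]) ** 2
                               + (x[1] - self.centroids[l][1]) ** 2)

class BaggingIncremental:
    """Hypothetical sketch: one small bagged ensemble per incoming batch,
    accumulated into a single pool that votes on predictions."""
    def __init__(self, estimators_per_batch=5, base=NearestCentroid):
        self.estimators_per_batch = estimators_per_batch
        self.base = base
        self.learners = []

    def partial_fit(self, X, y):
        # train estimators on bootstrap resamples of the new batch only
        for _ in range(self.estimators_per_batch):
            idx = [random.randrange(len(X)) for _ in range(len(X))]
            self.learners.append(
                self.base().fit([X[i] for i in idx], [y[i] for i in idx]))

    def predict(self, x):
        # plurality vote over every learner trained so far
        votes = Counter(m.predict(x) for m in self.learners)
        return votes.most_common(1)[0][0]
```

Note the design choice this sketch assumes: old data is never revisited, so incremental learning is achieved purely by growing the ensemble, which is also why the pool size (and prediction cost) increases with each batch.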