Genetic algorithms + data structures = evolution programs (3rd ed.)
Ensemble learning via negative correlation
Neural Networks
Ensembling neural networks: many could be better than all
Artificial Intelligence
Negative correlation learning and evolutionary design of neural network ensembles
Negative correlation learning and evolutionary design of neural network ensembles
A Preliminary Study on Negative Correlation Learning via Correlation-Corrected Data (NCCD)
Neural Processing Letters
An analysis of diversity measures
Machine Learning
A note on the utility of incremental learning
AI Communications
Negative correlation in incremental learning
Natural Computing: an international journal
Learn++: an incremental learning algorithm for supervised neural networks
IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews
Pareto-Based Multiobjective Machine Learning: An Overview and Case Studies
IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews
Evolutionary ensembles with negative correlation learning
IEEE Transactions on Evolutionary Computation
Making use of population information in evolutionary artificial neural networks
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Simultaneous training of negatively correlated neural networks in an ensemble
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
An evolutionary artificial neural networks approach for breast cancer diagnosis
Artificial Intelligence in Medicine
IEEE Transactions on Neural Networks
Incremental learning by heterogeneous bagging ensemble
ADMA'10 Proceedings of the 6th international conference on Advanced data mining and applications - Volume Part II
Negative correlation learning (NCL) is a successful approach to constructing neural network ensembles. In batch learning mode, NCL outperforms many other ensemble learning approaches. Recently, NCL has also been shown to be a potentially powerful approach to incremental learning, although its advantages have not yet been fully exploited. In this paper, we propose a selective NCL (SNCL) algorithm for incremental learning. Concretely, every time a new training data set is presented, the previously trained neural network ensemble is cloned. The cloned ensemble is then trained on the new data set. After that, the new ensemble is combined with the previous ensemble, and a selection process is applied to prune the whole ensemble to a fixed size. This paper is an extended version of our preliminary paper on SNCL. Compared to the previous work, this paper presents a deeper investigation into SNCL, considering different objective functions for the selection process and comparing SNCL to other NCL-based incremental learning algorithms on two more real-world bioinformatics data sets. Experimental results demonstrate the advantage of SNCL. Furthermore, comparisons between SNCL and other existing incremental learning algorithms, such as Learn++ and ARTMAP, are also presented.
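The clone-train-combine-prune loop described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: ridge-style linear least-squares learners stand in for NCL-trained neural networks, the NCL diversity penalty is omitted, and selection prunes by mean squared error on the new data (a simplified stand-in for the paper's selection objectives). The helper names `train_learner` and `sncl_step` are hypothetical.

```python
import numpy as np

def train_learner(X, y):
    # Stand-in base learner: least-squares linear model
    # (a real SNCL member would be a neural network trained with
    # the negative correlation penalty).
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

def sncl_step(ensemble, X_new, y_new, max_size):
    """One incremental step of the SNCL-style loop (sketch):
    clone the previous ensemble, train the clones on the new data,
    merge old and new members, then prune back to a fixed size."""
    n_clones = max(len(ensemble), 1)  # bootstrap from an empty ensemble
    clones = [train_learner(X_new, y_new) for _ in range(n_clones)]
    pool = ensemble + clones
    # Selection: keep the max_size members with lowest error on the
    # new data (the paper studies several selection objectives).
    errors = [np.mean((X_new @ w - y_new) ** 2) for w in pool]
    keep = np.argsort(errors)[:max_size]
    return [pool[i] for i in keep]

def ensemble_predict(ensemble, X):
    # Simple-average combination of member outputs.
    return np.mean([X @ w for w in ensemble], axis=0)
```

Note that the fixed ensemble size bounds memory across incremental steps, which is the practical point of the selection stage.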