IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
In this paper, we propose two cooperative ensemble learning algorithms, NegBagg and NegBoost, for designing neural network (NN) ensembles. The proposed algorithms incrementally train the individual NNs in an ensemble using the negative correlation learning algorithm. Bagging and boosting are used in NegBagg and NegBoost, respectively, to create different training sets for the different NNs in the ensemble. The idea behind combining negative correlation learning with bagging/boosting is to facilitate interaction and cooperation among the NNs during training. Both NegBagg and NegBoost use a constructive approach to automatically determine the number of hidden neurons for each NN; NegBoost also uses this constructive approach to automatically determine the number of NNs in the ensemble. The two algorithms were tested on a number of benchmark problems from the machine learning and NN literature, including the Australian credit card assessment, breast cancer, diabetes, glass, heart disease, letter recognition, satellite, soybean, and waveform problems. The experimental results show that NegBagg and NegBoost require a small number of training epochs to produce compact NN ensembles with good generalization.
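To make the core idea concrete, the sketch below combines bagging with a negative-correlation penalty, in the spirit of NegBagg. It is a minimal illustration, not the authors' algorithm: it uses linear models instead of constructively grown NNs, a toy regression task instead of the paper's classification benchmarks, and a standard negative-correlation gradient of the form (F_i - y) - lambda * (F_i - Fbar), where Fbar is the ensemble mean output. All names and parameter values (`lam`, `lr`, the ensemble size `M`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (assumption: the paper evaluates on classification
# benchmarks; a small synthetic regression task keeps this sketch short).
X = rng.normal(size=(200, 5))
true_w = rng.normal(size=5)
y = X @ true_w + 0.1 * rng.normal(size=200)

M = 5       # ensemble size (NegBoost would grow this constructively)
lam = 0.5   # negative-correlation penalty strength; lam = 0 gives plain bagging
lr = 0.05   # learning rate
W = rng.normal(scale=0.1, size=(M, 5))  # one linear model per ensemble member

# Bagging: each member trains on its own bootstrap sample of the data.
boots = [rng.integers(0, len(X), size=len(X)) for _ in range(M)]

for epoch in range(200):
    # Negative correlation learning needs the current ensemble mean,
    # so all members are updated within the same epoch.
    for i, idx in enumerate(boots):
        Xi, yi = X[idx], y[idx]
        F = Xi @ W.T              # outputs of all members on member i's sample
        Fbar = F.mean(axis=1)     # ensemble mean output per example
        Fi = F[:, i]
        # Per-example gradient of member i's error plus the correlation
        # penalty: (F_i - y) - lam * (F_i - Fbar).
        g = (Fi - yi) - lam * (Fi - Fbar)
        W[i] -= lr * (g @ Xi) / len(Xi)

# Ensemble prediction = mean of member outputs.
ens_pred = (X @ W.T).mean(axis=1)
mse = float(np.mean((ens_pred - y) ** 2))
print(mse)
```

The penalty term pushes each member's output away from the ensemble mean, so the members' errors become negatively correlated and the averaged prediction benefits; with `lam = 0` the loop reduces to independently trained bagged models.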