Negative Correlation Learning (NCL) is a neural network ensemble learning algorithm that introduces a correlation penalty term into the cost function of each individual network, so that each network minimizes its mean squared error (MSE) together with its correlation with the rest of the ensemble. This paper describes NCL in detail and observes that NCL corresponds to training the entire ensemble as a single learning machine that minimizes MSE without any regularization. This insight explains why NCL is prone to overfitting noise in the training set. The paper analyzes this problem and proposes the multiobjective regularized negative correlation learning (MRNCL) algorithm, which incorporates an additional regularization term for the ensemble and uses an evolutionary multiobjective algorithm to design ensembles. In MRNCL, we define the crossover and mutation operators and adopt a nondominated sorting algorithm with fitness sharing and rank-based fitness assignment. Experiments on synthetic as well as real-world data sets demonstrate that MRNCL achieves better performance than NCL, especially when the noise level in the data set is nontrivial. In the experimental discussion, we give three reasons why our algorithm outperforms others.
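The correlation penalty the abstract refers to can be sketched as follows. This is a minimal illustration of the standard NCL cost (individual MSE plus a penalty on each member's covariance with the rest of the ensemble), not the paper's MRNCL procedure; the function name and the penalty weight `lam` are illustrative, not taken from the paper.

```python
import numpy as np

def ncl_individual_errors(preds, target, lam=0.5):
    """Per-member NCL cost for an ensemble of M regressors on N points.

    preds: (M, N) array of the M members' predictions.
    target: scalar or (N,) array of desired outputs.
    lam: correlation-penalty strength (illustrative default).
    """
    preds = np.asarray(preds, dtype=float)
    f_bar = preds.mean(axis=0)              # ensemble mean prediction
    mse = 0.5 * (preds - target) ** 2       # individual squared error
    # penalty p_i = (f_i - f_bar) * sum_{j != i} (f_j - f_bar);
    # since the deviations sum to zero, this equals -(f_i - f_bar)^2
    dev = preds - f_bar
    penalty = dev * (dev.sum(axis=0) - dev)
    return (mse + lam * penalty).mean(axis=1)   # average cost per member
```

Because the penalty rewards disagreement with the ensemble mean, minimizing these costs trades individual accuracy against diversity; with `lam=0` the members are trained independently on plain MSE.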