We study the formal basis of Negative Correlation (NC) learning, an ensemble technique developed in the evolutionary computation literature. We show that, by removing an assumption made in the original work, NC can be derived from the Ambiguity decomposition of Krogh and Vedelsby. From this formalisation we calculate bounds on the penalty parameter and demonstrate significant improvements in empirical tests. We hypothesize that the reason for NC's success lies in its rescaling of an estimate of the ensemble covariance; we then show that, as this rescaling varies, NC moves smoothly between training a single neural network and training an ensemble system. Finally, we unify several other works in the literature, all of which exploit the Ambiguity decomposition in some way, and term them the Ambiguity Family.
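To make the two quantities in the abstract concrete, the following is a minimal NumPy sketch, not the paper's implementation: it computes the NC-style per-member gradient, `(f_i - d) - lam*(f_i - fbar)`, where the penalty term is `p_i = (f_i - fbar) * sum_{j!=i}(f_j - fbar) = -(f_i - fbar)^2` (the deviations from the ensemble mean sum to zero), and numerically checks the Ambiguity decomposition, which states that the squared error of the ensemble mean equals the weighted average member error minus the weighted average ambiguity. The function names and the uniform-weight default are illustrative choices, not from the source.

```python
import numpy as np

def nc_gradients(outputs, target, lam):
    """Per-member gradient of the NC error with respect to each output f_i.

    E_i = 0.5*(f_i - d)^2 + lam * p_i, where the NC penalty is
    p_i = (f_i - fbar) * sum_{j != i} (f_j - fbar) = -(f_i - fbar)^2.
    Treating the other members as fixed when differentiating (the
    convention in the original NC formulation) gives
    dE_i/df_i = (f_i - d) - lam * (f_i - fbar).
    With lam = 0 this reduces to independent single-network training.
    """
    outputs = np.asarray(outputs, dtype=float)
    fbar = outputs.mean()
    return (outputs - target) - lam * (outputs - fbar)

def ambiguity_decomposition(outputs, target, weights=None):
    """Return (ensemble squared error, avg member error - avg ambiguity).

    Krogh & Vedelsby: (fbar - d)^2 = sum_i w_i*(f_i - d)^2
                                   - sum_i w_i*(f_i - fbar)^2.
    The two returned values are equal for any outputs and convex weights.
    """
    outputs = np.asarray(outputs, dtype=float)
    m = len(outputs)
    w = np.full(m, 1.0 / m) if weights is None else np.asarray(weights, dtype=float)
    fbar = np.dot(w, outputs)
    avg_err = np.dot(w, (outputs - target) ** 2)
    ambiguity = np.dot(w, (outputs - fbar) ** 2)
    return (fbar - target) ** 2, avg_err - ambiguity
```

Setting `lam = 0` decouples the members, while increasing `lam` (up to the bound analysed in the paper) rescales the covariance-like penalty, which is the smooth interpolation between individual and ensemble training that the abstract describes.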