Both theoretical and experimental studies have shown that combining accurate neural networks (NNs) whose errors are negatively correlated greatly improves the generalization ability of the ensemble. Negative correlation learning (NCL) and mixture of experts (ME), two popular combining methods, each employ a specialized error function to train the NNs simultaneously and produce negatively correlated experts. In this paper, we review the properties of the NCL and ME methods, discussing their advantages and disadvantages. This characterization shows that the two methods have different but complementary features, suggesting that a hybrid system combining features of both NCL and ME may outperform either approach on its own. Accordingly, two hybrid approaches are proposed that compensate for the weaknesses of one method with the strengths of the other: gated NCL (G-NCL) and mixture of negatively correlated experts (MNCE). In the first approach, G-NCL, the dynamic combiner (gating network) of ME combines the outputs of the base experts trained with NCL. This combiner provides an efficient tool for evaluating and combining the NCL experts, using weights estimated dynamically from the inputs that reflect each expert's competence in different regions of the problem space. In the second approach, MNCE, the control parameter of NCL is incorporated into the error function of ME, enabling the ME training algorithm to explicitly adjust the degree of negative correlation between the experts. This control parameter can be regarded as a regularization term added to the ME error function, establishing a better balance in the bias-variance-covariance trade-off and thereby improving generalization. The two proposed hybrid ensemble methods, G-NCL and MNCE, are compared with their constituent methods, ME and NCL, on several benchmark problems. The experimental results show that the proposed methods preserve the advantages and alleviate the disadvantages of their constituent approaches, offering significantly improved performance over the original methods.
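To make the two error functions being hybridized concrete, the following is a minimal sketch in LaTeX of the standard NCL penalty (Liu and Yao's formulation) and the standard ME gated combination (Jacobs et al.'s formulation); the final hybrid line is only an illustration of how an NCL-style penalty could be attached to a gated ME objective, with the gate parameters v_i and the hybrid error e_i^hyb being assumptions for illustration, not the paper's exact MNCE error function.

% NCL: penalized error for expert i on input x with target d,
% where \lambda controls the strength of negative correlation
% and \bar{f}(x) is the mean output of the M experts.
e_i = \frac{1}{2}\bigl(f_i(x) - d\bigr)^2 + \lambda\, p_i,
\qquad
p_i = \bigl(f_i(x) - \bar{f}(x)\bigr)\sum_{j \neq i}\bigl(f_j(x) - \bar{f}(x)\bigr),
\qquad
\bar{f}(x) = \frac{1}{M}\sum_{j=1}^{M} f_j(x).

% ME: the gating network produces input-dependent softmax weights
% g_i(x) (assumed linear gate parameters v_i) that combine the experts.
f_{\mathrm{ME}}(x) = \sum_{i=1}^{M} g_i(x)\, f_i(x),
\qquad
g_i(x) = \frac{\exp\bigl(v_i^{\top} x\bigr)}{\sum_{j=1}^{M}\exp\bigl(v_j^{\top} x\bigr)}.

% Illustrative hybrid (MNCE-style): a gated squared error regularized
% by an NCL-type correlation penalty; \lambda tunes the trade-off.
e_i^{\mathrm{hyb}} = g_i(x)\,\frac{1}{2}\bigl(f_i(x) - d\bigr)^2 + \lambda\, p_i.

Setting \lambda = 0 in the hybrid line recovers a plain gated ME-style error, while increasing \lambda pushes the experts' errors toward negative correlation, which is the bias-variance-covariance balancing role the abstract ascribes to the control parameter.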