Boosted Pre-loaded Mixture of Experts for low-resolution face recognition
International Journal of Hybrid Intelligent Systems
Combining accurate neural networks (NNs) in an ensemble with negatively correlated errors greatly improves generalization ability. Mixture of experts (ME) is a popular combining method that employs a special error function in the simultaneous training of NN experts so as to produce negatively correlated experts. Although ME can produce negatively correlated experts, it lacks an explicit control parameter, such as the one in the negative correlation learning (NCL) method, for adjusting the strength of this correlation. In this study, an approach is proposed to introduce this advantage of NCL into the training algorithm of ME, yielding the mixture of negatively correlated experts (MNCE). In the proposed method, the control parameter of NCL is incorporated into the error function of ME, which enables the training algorithm to strike a better balance in the bias-variance-covariance trade-off and thus improves generalization ability. The proposed hybrid ensemble method, MNCE, is compared with its constituent methods, ME and NCL, on several benchmark problems. The experimental results show that the proposed ensemble method significantly outperforms the original ensemble methods.
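To make the role of NCL's control parameter concrete, the sketch below computes a per-expert NCL-style loss: squared error plus a correlation penalty scaled by a coefficient `lam`. This is a minimal illustration of the standard NCL penalty only, not the MNCE error function itself (which, per the abstract, embeds such a penalty inside ME's gated error); the function name and signature are hypothetical.

```python
def ncl_losses(preds, target, lam):
    """NCL-style per-expert loss for a uniformly averaged ensemble.

    preds:  list of scalar predictions, one per expert
    target: scalar ground-truth value
    lam:    NCL control parameter; lam = 0 reduces to independent
            squared-error training, larger lam pushes experts' errors
            to be negatively correlated around the ensemble mean.
    """
    f_bar = sum(preds) / len(preds)  # ensemble (mean) output
    losses = []
    for f_i in preds:
        # NCL penalty (f_i - f_bar) * sum_{j != i} (f_j - f_bar);
        # since deviations from the mean sum to zero, this equals
        # -(f_i - f_bar)**2, rewarding disagreement with the ensemble.
        penalty = -(f_i - f_bar) ** 2
        losses.append(0.5 * (f_i - target) ** 2 + lam * penalty)
    return losses
```

With `lam = 0` each expert is trained on its own squared error; increasing `lam` trades individual accuracy for error diversity, which is exactly the bias-variance-covariance balance the abstract refers to.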