The Strength of Weak Learnability. Machine Learning.
Bias/variance analyses of mixtures-of-experts architectures. Neural Computation.
Boosted mixture of experts: an ensemble learning scheme. Neural Computation.
Combining predictors: comparison of five meta machine learning methods. Information Sciences: An International Journal.
Ensemble learning via negative correlation. Neural Networks.
FG '00: Proceedings of the Fourth IEEE International Conference on Automatic Face and Gesture Recognition, 2000.
Neural-Network Based Measures of Confidence for Word Recognition. ICASSP '97: Proceedings of the 1997 IEEE International Conference on Acoustics, Speech, and Signal Processing, Volume 2.
Face recognition: A literature survey. ACM Computing Surveys (CSUR).
Combining Pattern Classifiers: Methods and Algorithms.
The Knowledge Engineering Review.
Regularizing the effect of input noise injection in feedforward neural networks training. Neural Computing and Applications.
A neural network based multi-classifier system for gene identification in DNA sequences. Neural Computing and Applications.
Face recognition from a single image per person: A survey. Pattern Recognition.
Multi-Classifier Systems: Review and a roadmap for developers. International Journal of Hybrid Intelligent Systems.
Adaptive mixtures of local experts. Neural Computation.
Computer Vision and Image Understanding.
A new framework for small sample size face recognition based on weighted multiple decision templates. ICONIP '10: Proceedings of the 17th International Conference on Neural Information Processing: Theory and Algorithms, Part I.
Simultaneous training of negatively correlated neural networks in an ensemble. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics.
Bagging and Boosting Negatively Correlated Neural Networks. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics.
k-nearest neighbors directed noise injection in multilayer perceptron training. IEEE Transactions on Neural Networks.
On overfitting, generalization, and randomly expanded training sets. IEEE Transactions on Neural Networks.
Incorporation of a Regularization Term to Control Negative Correlation in Mixture of Experts. Neural Processing Letters.
An automatic method for construction of ensembles to time series prediction. International Journal of Hybrid Intelligent Systems.
A modified version of Boosted Mixture of Experts (BME) for low-resolution face recognition is presented in this paper. Most methods developed for low-resolution face recognition focus on improving the resolution of face images and/or on special feature extraction methods that can cope with the low-resolution problem; in this paper we focus instead on the classification step of the face recognition process. Combining Neural Networks (NNs) is an efficient approach to complex classification problems such as low-resolution face recognition, which involves high-dimensional feature sets and highly overlapping classes. Mixture of Experts (ME) and boosting are two of the most popular and interesting NN combining methods, and both have great potential for improving classification performance. A modified combining approach based on features of both ME and boosting is presented in order to handle this complex classification problem efficiently. Previous works [1,2] attempted to incorporate the complementary features of boosting into the ME training algorithm to boost performance; these approaches, called Boosted Mixture of Experts (BME), have some drawbacks. Based on an analysis of the problems of these earlier approaches, some modifications are suggested in this paper. A modification of the pre-loading initialization procedure of ME is proposed to overcome the limitations of the previous approaches through a two-stage pre-loading procedure. In our approach, both the error and the confidence measure are used as the difficulty criteria in the boosting-based partitioning of the problem space. Reflecting the nature of this approach, we call the proposed method Boosted Pre-loaded Mixture of Experts (BPME). The proposed method is tested on a low-resolution face recognition problem and compared with other variations of ME and the boosting method. The experiments are conducted on low-resolution variants of two common face databases, ORL and Yale. The experimental results show that BPME achieves significantly better recognition rates than the other compared combining methods under various test conditions, including different quality grades of face images and different training set sizes.
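To make the difficulty criterion concrete, the sketch below shows one plausible way to combine per-sample error and confidence into a single difficulty score and use it to select the hard samples for the next expert. This is a minimal Python/NumPy sketch under assumed interfaces: the function names, the alpha trade-off parameter, and the quantile-based selection are illustrative assumptions, not the authors' actual BPME procedure.

import numpy as np

def difficulty_scores(probs, y_true, alpha=0.5):
    # Combine the two difficulty criteria mentioned in the abstract:
    # classification error and (lack of) confidence in the true class.
    # probs : (n_samples, n_classes) predicted class probabilities
    # y_true: (n_samples,) integer class labels
    # alpha : assumed trade-off weight between the two criteria
    preds = probs.argmax(axis=1)
    error = (preds != y_true).astype(float)             # 1 if misclassified
    confidence = probs[np.arange(len(y_true)), y_true]  # probability of the true class
    return alpha * error + (1.0 - alpha) * (1.0 - confidence)

def partition_for_next_expert(X, y, probs, quantile=0.5):
    # Keep the hardest samples (by the combined criterion) as the subset on
    # which the next expert is pre-loaded, mimicking a boosting-style
    # partition of the problem space.
    d = difficulty_scores(probs, y)
    hard = d >= np.quantile(d, quantile)
    return X[hard], y[hard]

In a full system, each new expert would presumably be initialized (pre-loaded) on its hard subset in this way before the usual gated ME training continues over the whole training set.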