Face Recognition by Elastic Bunch Graph Matching
IEEE Transactions on Pattern Analysis and Machine Intelligence
A decision-theoretic generalization of on-line learning and an application to boosting
Journal of Computer and System Sciences - Special issue: 26th annual ACM symposium on the theory of computing & STOC '94, May 23–25, 1994, and second annual European conference on computational learning theory (EuroCOLT '95), March 13–15, 1995
Improved Boosting Algorithms Using Confidence-rated Predictions
Machine Learning - The Eleventh Annual Conference on Computational Learning Theory
Support vector machines applied to face recognition
Proceedings of the 1998 Conference on Advances in Neural Information Processing Systems II
The FERET Evaluation Methodology for Face-Recognition Algorithms
IEEE Transactions on Pattern Analysis and Machine Intelligence
Distortion Invariant Object Recognition in the Dynamic Link Architecture
IEEE Transactions on Computers
Improving Algorithms for Boosting
COLT '00 Proceedings of the Thirteenth Annual Conference on Computational Learning Theory
Simple Gabor feature space for invariant object recognition
Pattern Recognition Letters
FloatBoost Learning and Statistical Face Detection
IEEE Transactions on Pattern Analysis and Machine Intelligence
Face Authentication Test on the BANCA Database
ICPR '04: Proceedings of the 17th International Conference on Pattern Recognition (ICPR'04), Volume 4
Face recognition using Ada-boosted Gabor features
FGR '04: Proceedings of the Sixth IEEE International Conference on Automatic Face and Gesture Recognition
IEEE Transactions on Image Processing
Facial affect recognition using regularized discriminant analysis-based algorithms
EURASIP Journal on Advances in Signal Processing - Special issue on video analysis for human behavior understanding
Privacy-by-design rules in face recognition system
Neurocomputing
Though AdaBoost has been widely used for feature selection and classifier learning, many of the selected features, or weak classifiers, are redundant. By incorporating mutual information into AdaBoost, we propose an improved boosting algorithm in this paper. The proposed method explicitly measures the redundancy between each candidate classifier and the classifiers already selected, so the classifiers it retains are both accurate and non-redundant. Experimental results show that the strong classifier learned with the proposed algorithm achieves a lower training error rate than AdaBoost. The proposed algorithm has also been applied to select discriminative Gabor features for face recognition. Even with a simple correlation distance measure and a 1-NN classifier, the selected Gabor features achieve high recognition accuracy on the FERET database, which exhibits both expression and illumination variation. With only 140 features, the selected features reach 95.5% accuracy, about 2.5% higher than features selected by standard AdaBoost.
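The selection scheme described above can be sketched as a greedy AdaBoost-style loop in which a candidate weak classifier is rejected when the mutual information between its outputs and those of any already-selected classifier is too high. This is only a minimal illustration of the idea, not the paper's implementation: the decision-stump form, the median threshold, and the MI cutoff `mi_max` are all assumptions made for the sketch.

```python
import numpy as np

def mutual_info(a, b):
    """Empirical mutual information (nats) between two binary output vectors."""
    mi = 0.0
    for u in (0, 1):
        for v in (0, 1):
            p_uv = np.mean((a == u) & (b == v))
            if p_uv > 0:
                mi += p_uv * np.log(p_uv / (np.mean(a == u) * np.mean(b == v)))
    return mi

def stump_predict(X, feat, thresh):
    """Weak classifier: threshold a single feature (hypothetical stump form)."""
    return (X[:, feat] > thresh).astype(int)

def mi_adaboost(X, y, n_rounds=5, mi_max=0.2):
    """Greedy AdaBoost-like selection that skips candidates whose outputs
    are redundant (MI above mi_max) with already-selected stumps."""
    n, d = X.shape
    w = np.ones(n) / n                      # example weights
    selected, alphas, outputs = [], [], []
    for _ in range(n_rounds):
        best = None
        for f in range(d):
            t = np.median(X[:, f])          # assumed threshold rule
            pred = stump_predict(X, f, t)
            # redundancy check against previously selected classifiers
            if any(mutual_info(pred, o) > mi_max for o in outputs):
                continue
            err = np.sum(w * (pred != y))
            if err < 0.5 and (best is None or err < best[0]):
                best = (err, f, t, pred)
        if best is None:                    # no accurate, non-redundant stump left
            break
        err, f, t, pred = best
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-10))
        w *= np.exp(alpha * (pred != y))    # reweight misclassified examples
        w /= w.sum()
        selected.append((f, t))
        alphas.append(alpha)
        outputs.append(pred)
    return selected, alphas
```

On a toy set where feature 1 duplicates feature 0, plain AdaBoost could pick both, whereas the MI check rejects the duplicate after the first round and stops once no informative, non-redundant stump remains.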