Face Recognition by Elastic Bunch Graph Matching
IEEE Transactions on Pattern Analysis and Machine Intelligence
A decision-theoretic generalization of on-line learning and an application to boosting
Journal of Computer and System Sciences - Special issue: 26th Annual ACM Symposium on Theory of Computing (STOC '94), May 23–25, 1994, and Second Annual European Conference on Computational Learning Theory (EuroCOLT '95), March 13–15, 1995
Improved Boosting Algorithms Using Confidence-rated Predictions
Machine Learning - The Eleventh Annual Conference on Computational Learning Theory
Support vector machines applied to face recognition
Proceedings of the 1998 Conference on Advances in Neural Information Processing Systems II
The FERET Evaluation Methodology for Face-Recognition Algorithms
IEEE Transactions on Pattern Analysis and Machine Intelligence
Distortion Invariant Object Recognition in the Dynamic Link Architecture
IEEE Transactions on Computers
Simple Gabor feature space for invariant object recognition
Pattern Recognition Letters
FloatBoost Learning and Statistical Face Detection
IEEE Transactions on Pattern Analysis and Machine Intelligence
Face Authentication Test on the BANCA Database
ICPR '04 Proceedings of the 17th International Conference on Pattern Recognition (ICPR '04), Volume 4
IEEE Transactions on Image Processing
ICB '09 Proceedings of the Third International Conference on Advances in Biometrics
Compact binary patterns (CBP) with multiple patch classifiers for fast and accurate face recognition
CompIMAGE'10 Proceedings of the Second International Conference on Computational Modeling of Objects Represented in Images
We propose a novel boosting algorithm, InfoBoost. Although AdaBoost has been widely used for feature selection and classifier learning, many of the features it selects are redundant. By incorporating mutual information into AdaBoost, InfoBoost explicitly examines the redundancy between candidate classifiers and those already selected, so the classifiers it picks are both accurate and non-redundant. Experimental results show that the strong classifier learned by InfoBoost has a lower training error than the one learned by AdaBoost. InfoBoost learning has also been applied to selecting discriminative Gabor features for face recognition. Even with a simple correlation distance measure and a 1-NN classifier, the selected Gabor features achieve high recognition accuracy on the FERET database, where both expression and illumination variations are present. With only 140 features, the InfoBoost-selected features reach 95.5% accuracy, about 2.5% higher than that achieved by AdaBoost.
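The abstract does not state InfoBoost's exact selection criterion, only that mutual information is used to penalize redundancy between candidate and already-selected classifiers. As a rough, hedged illustration of that idea, the sketch below modifies standard AdaBoost stump selection by adding a histogram-based mutual-information penalty to the weighted error. The function names (`infoboost_sketch`, `mutual_info`), the additive penalty form, and the `redundancy_weight` parameter are all assumptions for illustration, not the paper's actual algorithm.

```python
import numpy as np

def mutual_info(x, y, bins=8):
    """Crude histogram estimate of mutual information between two 1-D variables."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def stump_predict(x, thresh, polarity):
    """Decision stump: +1 on one side of the threshold, -1 on the other."""
    return np.where(polarity * (x - thresh) > 0, 1.0, -1.0)

def infoboost_sketch(X, y, n_rounds=5, redundancy_weight=1.0):
    """AdaBoost on decision stumps with an (assumed) mutual-information
    redundancy penalty: score = weighted error + weight * max MI with
    already-selected features. One feature is selected per round."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)
    selected, alphas, stumps = [], [], []
    for _ in range(n_rounds):
        best = None
        for j in range(d):
            if j in selected:
                continue
            # redundancy of candidate feature j with already-selected features
            red = max((mutual_info(X[:, j], X[:, k]) for k in selected),
                      default=0.0)
            for thresh in np.percentile(X[:, j], [25, 50, 75]):
                for pol in (1, -1):
                    pred = stump_predict(X[:, j], thresh, pol)
                    err = w[pred != y].sum()
                    score = err + redundancy_weight * red
                    if best is None or score < best[0]:
                        best = (score, err, j, thresh, pol)
        _, err, j, thresh, pol = best
        err = min(max(err, 1e-10), 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)       # standard AdaBoost weight
        pred = stump_predict(X[:, j], thresh, pol)
        w *= np.exp(-alpha * y * pred)              # reweight examples
        w /= w.sum()
        selected.append(j)
        alphas.append(alpha)
        stumps.append((j, thresh, pol))

    def classify(Xn):
        s = sum(a * stump_predict(Xn[:, j], t, p)
                for a, (j, t, p) in zip(alphas, stumps))
        return np.sign(s)

    return classify, selected
```

Because a feature already in `selected` is skipped and candidates correlated with it pay an MI penalty, each round is pushed toward informative but non-redundant features, which is the qualitative behavior the abstract describes.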