Training the AdaBoost algorithm for face detection is time-consuming; in previous systems it often took days or weeks. In this paper, we describe optimization techniques and implementation details that reduce the training time. First, we apply a preprocessing technique that shrinks the candidate feature set to ten percent of its original size; we then use implementation optimizations to cut the training time further. In addition, we describe each feature with double thresholds, which improves the discriminative power of each feature and reduces the number of features required for the final strong classifier. Experimental results show that training our system is about a hundred times faster than in previous systems.
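To make the double-threshold idea concrete, the following is a minimal sketch (not the paper's actual implementation) of AdaBoost with double-threshold weak classifiers: instead of a single threshold per feature, each weak classifier predicts one class inside an interval [t_lo, t_hi] of the feature's response and the other class outside it. All function names, the exhaustive threshold search, and the synthetic data are illustrative assumptions; a real face-detection trainer would run this over Haar-like feature responses with the paper's preprocessing and speed optimizations.

```python
import numpy as np

def fit_double_threshold_stump(feature_vals, labels, weights):
    """Exhaustively choose (t_lo, t_hi, polarity) so that predicting
    `polarity` inside [t_lo, t_hi] minimizes the weighted error.
    O(m^2) in the number of unique values -- fine for a sketch only."""
    candidates = np.unique(feature_vals)
    best = (np.inf, 0.0, 0.0, 1)  # (error, t_lo, t_hi, polarity)
    for i, t_lo in enumerate(candidates):
        for t_hi in candidates[i:]:
            inside = (feature_vals >= t_lo) & (feature_vals <= t_hi)
            for polarity in (1, -1):
                pred = np.where(inside, polarity, -polarity)
                err = np.sum(weights[pred != labels])
                if err < best[0]:
                    best = (err, t_lo, t_hi, polarity)
    return best

def adaboost_double_threshold(features, labels, n_rounds=5):
    """features: (n_samples, n_features) matrix of scalar feature
    responses; labels in {-1, +1}. Returns the weak-classifier ensemble."""
    n = len(labels)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(n_rounds):
        # Select the feature and double threshold with lowest weighted error.
        best = None
        for j in range(features.shape[1]):
            err, t_lo, t_hi, pol = fit_double_threshold_stump(
                features[:, j], labels, w)
            if best is None or err < best[0]:
                best = (err, j, t_lo, t_hi, pol)
        err, j, t_lo, t_hi, pol = best
        err = max(err, 1e-10)  # avoid division by zero for a perfect stump
        alpha = 0.5 * np.log((1.0 - err) / err)
        pred = np.where((features[:, j] >= t_lo) & (features[:, j] <= t_hi),
                        pol, -pol)
        # Reweight: misclassified examples gain weight, correct ones lose it.
        w *= np.exp(-alpha * labels * pred)
        w /= w.sum()
        ensemble.append((alpha, j, t_lo, t_hi, pol))
    return ensemble

def predict(ensemble, features):
    """Weighted-majority vote of the double-threshold weak classifiers."""
    score = np.zeros(features.shape[0])
    for alpha, j, t_lo, t_hi, pol in ensemble:
        pred = np.where((features[:, j] >= t_lo) & (features[:, j] <= t_hi),
                        pol, -pol)
        score += alpha * pred
    return np.sign(score)
```

A double-threshold stump can isolate a class whose feature responses fall in a band, which a single-threshold stump cannot do alone; this is one plausible reason fewer features suffice for the final strong classifier.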