Boosting has recently come into wide use in object-detection applications because of its impressive speed and accuracy. However, learning the weak classifiers, one of the most significant tasks in boosting, is left to the user. In Discrete AdaBoost, weak classifiers with binary output are too weak to boost when the training data are complex. In Real AdaBoost, choosing an appropriate number of bins for the weak classifiers is difficult: too few bins may not approximate the real distribution accurately, while too many may cause over-fitting, increase computation time, and waste storage space. We have developed Ent-Boost, a novel boosting scheme that learns weak classifiers efficiently using entropy measures. Class-entropy information is used to estimate the optimal number of bins automatically through a discretization process. The Kullback-Leibler divergence, i.e. the relative entropy between the probability distributions of positive and negative samples, is then used to select the best weak classifier from the candidate set. Experiments show that strong classifiers learned by Ent-Boost achieve good performance while requiring compact storage. A robust face detector built with Ent-Boost demonstrates the effectiveness of the boosting scheme.
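The selection step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes each weak classifier corresponds to one scalar feature, bins the feature into weighted histograms of positive and negative samples, and scores each feature by a symmetrized Kullback-Leibler divergence between the two histograms (the function names, the fixed bin count, and the symmetrization are assumptions for the sketch; in Ent-Boost the bin count is itself estimated from class entropy).

```python
import numpy as np

def kl_divergence(p, q, eps=1e-10):
    """KL divergence D(p || q) between two discrete distributions.
    A small eps avoids log(0) and division by zero for empty bins."""
    p = (p + eps) / (p + eps).sum()
    q = (q + eps) / (q + eps).sum()
    return float(np.sum(p * np.log(p / q)))

def select_weak_classifier(features, labels, weights, n_bins=16):
    """Pick the feature whose positive/negative weighted histograms
    are most separated under a symmetrized KL divergence.

    features: (n_samples, n_features) array of scalar feature responses
    labels:   (n_samples,) array with values in {+1, -1}
    weights:  (n_samples,) AdaBoost sample weights (sum to 1)
    Returns (best_feature_index, best_score).
    """
    pos, neg = labels == 1, labels == -1
    best_j, best_score = -1, -np.inf
    for j in range(features.shape[1]):
        f = features[:, j]
        # Discretize this feature's range into n_bins equal-width bins.
        edges = np.linspace(f.min(), f.max(), n_bins + 1)
        w_pos, _ = np.histogram(f[pos], bins=edges, weights=weights[pos])
        w_neg, _ = np.histogram(f[neg], bins=edges, weights=weights[neg])
        # Symmetrize, since plain KL is asymmetric in its arguments.
        score = kl_divergence(w_pos, w_neg) + kl_divergence(w_neg, w_pos)
        if score > best_score:
            best_j, best_score = j, score
    return best_j, best_score
```

A feature whose response distributions for faces and non-faces barely overlap yields a large divergence and is selected; a feature whose two histograms coincide scores near zero.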