Ent-Boost: Boosting using entropy measures for robust object detection. Pattern Recognition Letters.
Pattern Recognition.
Performance of similarity measures based on histograms of local image feature vectors. Pattern Recognition Letters.
On selection and combination of weak learners in AdaBoost. Pattern Recognition Letters.
Face authentication using adapted local binary pattern histograms. In Proceedings of the 9th European Conference on Computer Vision (ECCV'06), Part IV.
Computing the Principal Local Binary Patterns for face recognition using data mining tools. Expert Systems with Applications: An International Journal.
Demographic classification with local binary patterns. In Proceedings of the 2007 International Conference on Advances in Biometrics (ICB'07).
Multi-scale local binary pattern histograms for face recognition. In Proceedings of the 2007 International Conference on Advances in Biometrics (ICB'07).
In this paper, we propose a novel learning method, called Jensen-Shannon Boosting (JSBoost), and demonstrate its application to object recognition. JSBoost incorporates Jensen-Shannon (JS) divergence [2] into AdaBoost learning. JS divergence is advantageous in that it provides a more appropriate measure of dissimilarity between two classes and is numerically more stable than alternatives such as Kullback-Leibler (KL) divergence (see [2]). At each iteration, the best features are learned by maximizing the projected JS divergence, and the best weak classifiers are derived from them. The weak classifiers are then combined into a strong classifier by minimizing the recognition error. We demonstrate JSBoost learning on face recognition using a local binary pattern (LBP) [13] based representation: JSBoost selects the best LBP features from thousands of candidates and constructs a strong classifier from the selected features. As shown by our experiments, JSBoost empirically produces better face recognition results than other AdaBoost variants such as RealBoost [12], GentleBoost [5], and KL-Boost [7].
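To make the abstract's stability claim concrete, here is a minimal sketch of the JS divergence between two discrete distributions (e.g., normalized LBP histograms of the two classes). The function names are illustrative, not from the paper; JS compares each distribution against their mixture, so every log argument stays positive and the result is always finite (bounded by log 2 in nats), whereas KL divergence blows up wherever one class has zero mass and the other does not.

```python
import numpy as np

def kl_divergence(p, q):
    # KL(p || q), summing only over bins where p has mass.
    # For the mixture m below, q > 0 wherever p > 0, so the log is finite.
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def js_divergence(p, q):
    """Jensen-Shannon divergence between two discrete distributions.

    JS(p, q) = 0.5 * KL(p || m) + 0.5 * KL(q || m),  m = (p + q) / 2.
    Symmetric, always finite, and bounded by log 2 (natural log),
    unlike KL, which diverges on mismatched supports.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()  # normalize histograms to probability distributions
    q = q / q.sum()
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)
```

In a boosting context like the one described above, such a score could rank candidate features by how well their response histograms separate the two classes, with higher JS divergence indicating a more discriminative feature.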