Despite its great success, two key problems remain unresolved for AdaBoost algorithms: how to select the most discriminative weak learners, and how to combine them optimally. In this paper, a new AdaBoost algorithm is proposed that improves both aspects. First, the most discriminative weak learners are selected by minimizing a novel distance-related criterion, the error-degree-weighted training error metric (ETEM), together with a generalization capability metric (GCM), rather than the training error rate alone. Second, starting from empirically set combination coefficients, the weak learners are combined optimally by tuning those coefficients with a kernel-based perceptron. Experiments on synthetic and real-scene data sets show that the proposed algorithm outperforms conventional AdaBoost.
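For context, the baseline the paper modifies can be sketched as discrete AdaBoost with decision stumps. The abstract does not define ETEM, GCM, or the kernel-perceptron tuning step, so this minimal sketch uses the conventional weighted-error selection criterion and the standard closed-form coefficients; the comments mark where the paper's two proposed components would replace them. Function and variable names here are illustrative, not from the paper.

```python
import numpy as np

def train_stump(X, y, w):
    """Pick the axis-aligned threshold stump with minimum weighted error.
    (The paper instead selects weak learners by minimizing ETEM + GCM;
    that criterion is not specified in the abstract, so plain weighted
    training error is used here.)"""
    best = (np.inf, 0, 0.0, 1)  # (error, feature, threshold, polarity)
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = sign * np.where(X[:, j] <= thr, 1, -1)
                err = w @ (pred != y)
                if err < best[0]:
                    best = (err, j, thr, sign)
    return best

def adaboost(X, y, T=30):
    """Discrete AdaBoost on labels y in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)          # example weights
    model = []
    for _ in range(T):
        err, j, thr, sign = train_stump(X, y, w)
        err = max(err, 1e-12)
        if err >= 0.5:               # no remaining edge over random guessing
            break
        # Closed-form coefficient; the paper instead treats these alphas as
        # an initial guess and re-tunes them with a kernel-based perceptron.
        alpha = 0.5 * np.log((1 - err) / err)
        pred = sign * np.where(X[:, j] <= thr, 1, -1)
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()                 # renormalize the distribution
        model.append((alpha, j, thr, sign))
    return model

def predict(model, X):
    """Sign of the alpha-weighted vote of all stumps."""
    score = sum(a * s * np.where(X[:, j] <= t, 1, -1)
                for a, j, t, s in model)
    return np.sign(score)
```

Training on a simple linearly separable problem shows the usual behavior the paper builds on: per-round reweighting concentrates mass on hard examples, and the ensemble's training error shrinks as rounds accumulate.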