Boosting is a popular approach for building accurate classifiers. Contrary to early belief, however, boosting algorithms do exhibit overfitting and are sensitive to label noise. Part of this sensitivity to outliers and noise can be attributed to the unboundedness of the margin-based loss functions that boosting algorithms employ. In this paper we describe two leveraging algorithms that build on boosting techniques and employ a bounded loss function of the margin. The first algorithm interleaves expectation maximization (EM) steps with boosting steps. The second algorithm decomposes a non-convex loss into a difference of two convex losses. We prove that both algorithms converge to a stationary point. We also analyze the generalization properties of the algorithms using Rademacher complexity. We describe experiments with both synthetic data and natural data (OCR and text) that demonstrate the merits of our framework, in particular its robustness to outliers.
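The difference-of-convex construction used by the second algorithm can be illustrated with a minimal sketch. The Python snippet below is an assumption for illustration only, not the paper's actual algorithm or loss: it takes the ramp loss as one example of a bounded margin loss, writes it as a difference of two convex hinge losses, and performs a single DCA/CCCP-style linearization of the concave part to obtain per-example weights for a subsequent boosting step. The function and variable names are hypothetical.

```python
# Illustrative sketch (not the paper's exact method): a bounded "ramp" margin
# loss expressed as a difference of two convex hinges, plus one DCA/CCCP-style
# weighting step in which the concave part is linearized at the current margins.
import numpy as np

def hinge(z, a=1.0):
    # Convex hinge: max(0, a - z)
    return np.maximum(0.0, a - z)

def ramp_loss(margins):
    # Bounded, non-convex ramp loss as a difference of two convex pieces:
    #   ramp(m) = max(0, 1 - m) - max(0, -m), which lies in [0, 1]
    return hinge(margins, 1.0) - hinge(margins, 0.0)

def dca_example_weights(margins):
    # One DCA/CCCP step: the concave part -max(0, -m) is replaced by its
    # linearization at the current margins. The resulting per-example weight
    # is the negative (sub)gradient of the linearized loss, so examples with
    # large negative margins (likely outliers or label noise) get zero weight,
    # unlike the exponential weights of AdaBoost.
    convex_grad = -(margins < 1.0).astype(float)   # d/dm of max(0, 1 - m)
    concave_grad = -(margins < 0.0).astype(float)  # d/dm of max(0, -m)
    weights = -(convex_grad - concave_grad)        # nonzero only for 0 < m < 1
    return weights / max(weights.sum(), 1e-12)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    margins = rng.normal(loc=0.5, scale=1.0, size=10)
    print("margins:    ", np.round(margins, 2))
    print("ramp loss:  ", np.round(ramp_loss(margins), 2))
    print("DCA weights:", np.round(dca_example_weights(margins), 3))
```

Because the weights vanish for strongly misclassified points, repeating such a weighting step between rounds of a standard boosting procedure yields the kind of outlier-robust behavior the abstract describes, at the cost of solving a non-convex problem only up to a stationary point.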