Noise sensitivity is a well-known weakness of the AdaBoost algorithm. Previous work has shown that AdaBoost is prone to overfitting on noisy data sets because it persistently assigns high weights to hard-to-learn instances (mislabeled instances or outliers). In this paper, a new boosting approach, named noise-detection based AdaBoost (ND-AdaBoost), is proposed to combine classifiers by emphasizing the training of misclassified noisy instances and correctly classified non-noisy instances. Specifically, the algorithm integrates a noise-detection based loss function into AdaBoost to adjust the weight distribution at each iteration. Two evaluation criteria, one based on k-nearest neighbors (k-NN) and one based on expectation maximization (EM), are constructed to detect noisy instances. Furthermore, a regeneration condition is presented and analyzed to control the ensemble training error bound of the proposed algorithm, providing theoretical support. Finally, experiments on selected binary UCI benchmark data sets demonstrate that the proposed algorithm is more robust on noisy data sets than standard AdaBoost and other AdaBoost variants.
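Since the abstract does not spell out the noise-detection loss, the Python sketch below only illustrates the general shape of the approach: a k-NN noise detector feeding into a boosting loop. The names (knn_noise_flags, noise_aware_adaboost), the Wilson-editing-style majority-disagreement criterion, and the decision to realize the stated emphasis (misclassified noisy plus correctly classified non-noisy instances count as successes) by flipping the labels of detected noisy instances before running the standard AdaBoost update are all assumptions on our part, not the authors' formulation; the EM criterion and the regeneration condition are omitted.

```python
# Illustrative sketch only: the k-NN criterion and the label-flipping step are
# assumptions based on the abstract, not the paper's actual loss function.
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.tree import DecisionTreeClassifier

def knn_noise_flags(X, y, k=5):
    """Flag an instance as noisy when the majority of its k nearest
    neighbors carry a different label (a Wilson-editing-style criterion)."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nn.kneighbors(X)                # column 0 is the query point itself
    disagree = (y[idx[:, 1:]] != y[:, None]).mean(axis=1)
    return disagree > 0.5

def noise_aware_adaboost(X, y, T=50, k=5):
    """AdaBoost with decision stumps on labels y in {-1, +1}. Labels of
    detected noisy instances are flipped before boosting, so correctly
    classifying a non-noisy instance and 'misclassifying' a noisy one both
    count as successes, matching the emphasis described in the abstract."""
    flags = knn_noise_flags(X, y, k)
    y_train = np.where(flags, -y, y)         # flip suspected-noisy labels
    n = len(y_train)
    w = np.full(n, 1.0 / n)
    learners, alphas = [], []
    for _ in range(T):
        h = DecisionTreeClassifier(max_depth=1).fit(X, y_train, sample_weight=w)
        pred = h.predict(X)
        err = w[pred != y_train].sum()
        if err <= 0.0 or err >= 0.5:         # usual guard: stop on degenerate rounds
            break
        alpha = 0.5 * np.log((1.0 - err) / err)
        w *= np.exp(-alpha * y_train * pred) # standard exponential-loss update
        w /= w.sum()
        learners.append(h)
        alphas.append(alpha)
    return learners, np.array(alphas)

def ensemble_predict(X, learners, alphas):
    """Weighted-vote prediction of the boosted ensemble."""
    votes = sum(a * h.predict(X) for h, a in zip(learners, alphas))
    return np.sign(votes)
```

Usage would be `learners, alphas = noise_aware_adaboost(X, y)` with `X` an (n, d) float array and `y` an (n,) array of labels in {-1, +1}. Replacing the hard label flip with a softer reweighting of flagged instances, or adding the paper's regeneration step, would be the natural places to move this sketch closer to ND-AdaBoost proper.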