Object detection is often formulated as a supervised binary classification task that relies on labeled training datasets. Such datasets usually contain noisy samples, including mislabeled and "hard-to-learn" samples, which degrade the generalization performance of trained classifiers and should therefore be pruned. In this paper, we propose a novel data pruning algorithm based on a recursive Bayes approach combined with AdaBoost: the recursive Bayes approach increases the confidence of predictions at every iteration, while AdaBoost minimizes the number of low-confidence predictions. Extensive experiments on real datasets show that the proposed algorithm effectively identifies and prunes noisy samples from training datasets while simultaneously improving classification and object detection performance.
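The recursive Bayes component is specific to the paper and is not reproduced here, but the general idea of using boosting confidence to flag noisy samples can be sketched. The snippet below is a minimal, hypothetical illustration (not the authors' algorithm): it trains a hand-rolled AdaBoost over 1-D threshold stumps, computes each sample's normalized margin y·F(x)/Σα, and treats the lowest-margin samples as pruning candidates, since mislabeled points tend to end up with the smallest margins.

```python
import numpy as np

def train_adaboost_stumps(x, y, rounds):
    """Plain AdaBoost with 1-D threshold stumps.

    Returns a list of (threshold, polarity, alpha) weak learners."""
    n = len(x)
    w = np.full(n, 1.0 / n)
    learners = []
    for _ in range(rounds):
        best = None
        for t in np.unique(x):
            for pol in (1, -1):
                pred = np.where(x > t, pol, -pol)
                err = w[pred != y].sum()
                if best is None or err < best[0]:
                    best = (err, t, pol, pred)
        err, t, pol, pred = best
        err = float(np.clip(err, 1e-10, 1 - 1e-10))
        alpha = 0.5 * np.log((1 - err) / err)
        w = w * np.exp(-alpha * y * pred)   # up-weight misclassified samples
        w /= w.sum()
        learners.append((t, pol, alpha))
    return learners

def normalized_margins(x, y, learners):
    """Signed margin y * F(x) / sum(alpha); low values mark suspect samples."""
    F = np.zeros(len(x), dtype=float)
    total = sum(a for _, _, a in learners)
    for t, pol, a in learners:
        F += a * np.where(x > t, pol, -pol)
    return y * F / total

# Toy 1-D data: true label is sign(x - 4.5); index 2 is deliberately mislabeled.
x = np.arange(10)
y = np.where(x > 4.5, 1, -1)
y[2] = 1                      # injected label noise

learners = train_adaboost_stumps(x, y, rounds=3)
m = normalized_margins(x, y, learners)
suspects = np.argsort(m)[:1]  # lowest-margin samples are pruning candidates
```

In this toy run the injected noisy sample receives the smallest normalized margin, so a threshold (or a fixed pruning fraction) on the margin would remove it; the paper's contribution is, in effect, a principled way to set that confidence criterion.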