This paper presents a novel AdaBoost.R training algorithm based on weight trimming, which increases training speed on large datasets while retaining prediction accuracy. At each iteration, the algorithm discards most of the samples with small weights and keeps only the samples with large weights to train the weak learner. Because only a small portion of the samples is used to train the weak learner in each round, training is faster. The method has been applied to mine safety monitoring, and the experimental results show that it performs well on large-scale data.
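Below is a minimal, self-contained Python sketch of the weight-trimming idea described above, applied to standard binary AdaBoost rather than the paper's AdaBoost.R regression variant; the keep_fraction threshold, the decision-stump weak learner, and all function names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def weight_trimmed_adaboost(X, y, n_rounds=50, keep_fraction=0.9):
    """Sketch of AdaBoost with weight trimming (assumed parameters).

    Each round, the weak learner is trained only on the highest-weight
    samples that together account for `keep_fraction` of the total
    weight mass; the low-weight remainder is discarded for that round.
    X, y are assumed to be NumPy arrays with labels y in {-1, +1}.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)  # uniform initial weights
    learners, alphas = [], []
    for _ in range(n_rounds):
        # Trimming step: sort by weight (descending) and keep the top
        # samples covering keep_fraction of the total weight. This is
        # where the training-time savings on large datasets come from.
        order = np.argsort(w)[::-1]
        cum = np.cumsum(w[order])
        cutoff = np.searchsorted(cum, keep_fraction * cum[-1]) + 1
        idx = order[:cutoff]

        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X[idx], y[idx], sample_weight=w[idx])

        # The weighted error, learner coefficient, and weight update
        # are computed on the FULL sample, so the boosting updates
        # remain those of standard AdaBoost.
        pred = stump.predict(X)
        err = np.clip(np.sum(w[pred != y]), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)

        w *= np.exp(-alpha * y * pred)
        w /= w.sum()

        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas

def predict(learners, alphas, X):
    """Weighted-majority vote of the boosted stumps."""
    score = sum(a * h.predict(X) for h, a in zip(learners, alphas))
    return np.sign(score)
```

Note that only the weak-learner fit sees the trimmed subset; the error estimate and weight update still run over all samples, which keeps the procedure a faithful AdaBoost while capturing the speedup the abstract describes.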