Scaling discrete AdaBoost to handle real-valued weak hypotheses has usually been approached through convex optimization, but little is known about such extensions from the standpoint of the original boosting model. We introduce a novel generalization of discrete AdaBoost that departs from this mainstream of algorithms. From the theoretical standpoint, it formally satisfies the original boosting property: it rapidly improves the accuracy of a weak learner up to arbitrarily high levels. It also brings computational and numerical improvements that make it significantly easier to use "as is". Conceptually, it provides a new and appealing extension to the reals of some well-known facts about discrete (Ada)Boosting. Perhaps the best known is the iterative weight-update mechanism, under which an example's weight decreases if and only if the current discrete weak hypothesis classifies it correctly. In our generalization this property no longer holds: with real-valued weak hypotheses, examples that receive the right class can still be reweighted higher. From the experimental standpoint, our generalization produces low-error formulas with distinctive cumulative margin distribution graphs, and it gracefully handles the noisy domains that are the Achilles' heel of common adaptive boosting algorithms.
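The weight-update property discussed above can be made concrete with a small sketch of the classic discrete AdaBoost reweighting step (not the paper's generalization): with predictions in {-1, +1} and confidence alpha > 0, the multiplicative update decreases an example's weight if and only if the weak hypothesis classifies it correctly. Function and variable names here are illustrative, not from the paper.

```python
import numpy as np

def discrete_adaboost_reweight(w, y, h, alpha):
    """One reweighting step of discrete AdaBoost.

    w:     current example weights (a distribution)
    y:     true labels in {-1, +1}
    h:     discrete weak-hypothesis predictions in {-1, +1}
    alpha: confidence assigned to the weak hypothesis (alpha > 0)

    The update w_i <- w_i * exp(-alpha * y_i * h_i) shrinks w_i
    exactly when y_i * h_i = +1, i.e. when example i is correctly
    classified. With real-valued h_i in [-1, +1], as in the paper's
    generalization, this strict equivalence can break down.
    """
    w = w * np.exp(-alpha * y * h)
    return w / w.sum()  # renormalize to a distribution

# Toy usage: examples 0 and 2 are correctly classified (y == h),
# so after the update they carry less weight than examples 1 and 3.
w = np.full(4, 0.25)
y = np.array([+1, +1, -1, -1])
h = np.array([+1, -1, -1, +1])
w_new = discrete_adaboost_reweight(w, y, h, alpha=0.5)
```

This is only the discrete case; the point of the abstract is that its real-valued generalization deliberately abandons the "correct implies down-weighted" guarantee shown here.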