Many regression and classification algorithms proposed over the years can be described as greedy procedures for the stagewise minimization of an appropriate cost function; examples include additive models, matching pursuit, and boosting. In this work we focus on the classification problem, for which many such algorithms have been proposed and applied successfully. For a specific regularized form of greedy stagewise optimization, we prove consistency of the approach under rather general conditions. Focusing on specific classes of problems, we provide conditions under which our greedy procedure achieves the (nearly) minimax rate of convergence, implying that the procedure cannot be improved in a worst-case setting. We also construct a fully adaptive procedure which, without knowing the smoothness parameter of the decision boundary, converges at the same rate as if that parameter were known.
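To make the idea of greedy stagewise minimization concrete, the following is a minimal sketch in plain Python, not the paper's actual procedure: each round adds the scaled decision stump that most reduces the empirical exponential loss, with a fixed small step standing in (as a simplifying assumption) for the regularization analyzed in the paper. All names (`greedy_stagewise_fit`, `stump_predict`, etc.) are illustrative.

```python
import math

def stump_predict(x, threshold, sign):
    """A decision stump on a scalar input: sign if x > threshold, else -sign."""
    return sign if x > threshold else -sign

def greedy_stagewise_fit(xs, ys, n_rounds=10, step=0.5):
    """Greedy stagewise minimization of the empirical exponential loss.

    Each round exhaustively searches over stumps (threshold, sign) and adds
    the one that most reduces sum_i exp(-y_i * F(x_i)), where F is the
    current additive model. The fixed step size is a crude stand-in for the
    regularization in the procedure the paper analyzes.
    """
    F = [0.0] * len(xs)          # current additive model evaluated on the sample
    model = []                   # chosen (threshold, sign, step) triples
    thresholds = sorted(set(xs))
    for _ in range(n_rounds):
        best, best_loss = None, float("inf")
        for t in thresholds:
            for s in (-1.0, 1.0):
                loss = sum(math.exp(-y * (f + step * stump_predict(x, t, s)))
                           for x, y, f in zip(xs, ys, F))
                if loss < best_loss:
                    best_loss, best = loss, (t, s)
        t, s = best
        F = [f + step * stump_predict(x, t, s) for x, f in zip(xs, F)]
        model.append((t, s, step))
    return model

def model_predict(model, x):
    """Classify x by the sign of the learned additive model."""
    score = sum(step * stump_predict(x, t, s) for t, s, step in model)
    return 1.0 if score > 0 else -1.0
```

With the exponential loss this reduces to a boosting-style update; swapping in another convex surrogate loss changes only the inner `loss` expression, which is why results of this kind can cover a family of stagewise algorithms at once.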