Suboptimality of penalized empirical risk minimization in classification
COLT'07 Proceedings of the 20th annual conference on Learning theory
We consider the problem of optimality, in a minimax sense, and adaptivity to the margin and to regularity in binary classification. We prove an oracle inequality, under the margin assumption (low noise condition), satisfied by an aggregation procedure which uses exponential weights. This oracle inequality has an optimal residual: (log M/n)^{κ/(2κ−1)}, where κ is the margin parameter, M the number of classifiers to aggregate and n the number of observations. We use this inequality first to construct minimax classifiers under margin and regularity assumptions, and second to aggregate them to obtain a classifier which is adaptive both to the margin and to regularity. Moreover, by aggregating plug-in classifiers (only log n of them), we provide an easily implementable classifier adaptive both to the margin and to regularity.
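The aggregation procedure in the abstract combines M classifiers via exponential weights driven by their empirical risks. The following is a minimal sketch of that idea, not the paper's exact procedure: it uses 0-1 loss and a hypothetical temperature parameter, and all function names are illustrative.

```python
import numpy as np

def exponential_weights(classifier_preds, y, temperature=1.0):
    """Compute exponential weights for M classifiers.

    classifier_preds: (M, n) array of {0,1} predictions on n observations.
    y: (n,) array of {0,1} labels.
    Weight of classifier j is proportional to exp(-n * risk_j / temperature),
    where risk_j is its empirical 0-1 risk.
    """
    n = y.shape[0]
    risks = np.mean(classifier_preds != y, axis=1)   # empirical 0-1 risks
    logits = -n * risks / temperature
    logits -= logits.max()                           # numerical stability
    w = np.exp(logits)
    return w / w.sum()

def aggregate_predict(classifier_preds, w):
    """Weighted majority vote of the M classifiers."""
    avg = w @ classifier_preds                       # convex combination in [0, 1]
    return (avg >= 0.5).astype(int)
```

A classifier with lower empirical risk receives exponentially more weight, so the aggregate tracks the best classifier in the pool up to the residual term of the oracle inequality.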