Recent results in theoretical machine learning suggest that nice properties of the margin distribution over a training set translate into good generalization performance of a classifier. The same principle is already exploited in SVMs and other kernel-based methods, whose associated optimization problems maximize the minimum of these margins. In this paper, we propose a kernel-based method for the direct optimization of the margin distribution (KM-OMD). The method is motivated and analyzed from a game-theoretic perspective, and an efficient optimization algorithm is then proposed. Experimental results over a standard benchmark of 13 datasets show state-of-the-art performance.
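To make the central object concrete: for a classifier f, the (functional) margin of a training example (x_i, y_i) is y_i·f(x_i), and the margin distribution is the collection of these values over the whole training set. The sketch below is illustrative only, using a plain perceptron as a stand-in linear classifier and a hypothetical toy dataset, not KM-OMD or the paper's 13-dataset benchmark; it merely shows the quantity whose distribution KM-OMD optimizes.

```python
import numpy as np

# Toy linearly separable data (hypothetical, for illustration only).
X = np.array([[0.0, 1.0], [1.0, 2.0], [2.0, 3.0],
              [3.0, 0.0], [4.0, 1.0], [5.0, 2.0]])
y = np.array([1, 1, 1, -1, -1, -1])

# Train a simple perceptron (a stand-in linear classifier, not KM-OMD).
w = np.zeros(2)
b = 0.0
for _ in range(100):
    for xi, yi in zip(X, y):
        if yi * (xi @ w + b) <= 0:   # misclassified: update
            w += yi * xi
            b += yi

# The margin distribution: one functional margin y_i * f(x_i) per example.
# SVMs maximize the minimum of these values; KM-OMD instead optimizes
# properties of the whole distribution.
margins = y * (X @ w + b)
print("min margin:", margins.min())
```

On separable data the perceptron converges, so every margin ends up positive; the interesting statistics are then the minimum, mean, and spread of `margins`.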