Decision theoretic generalizations of the PAC model for neural net and other learning applications
Information and Computation
An introduction to support Vector Machines: and other kernel-based learning methods
Learning in Neural Networks: Theoretical Foundations
Model Selection and Error Estimation
Machine Learning
Data-dependent margin-based generalization bounds for classification
The Journal of Machine Learning Research
On the combinatorial representation of information
COCOON'06: Proceedings of the 12th Annual International Conference on Computing and Combinatorics
On the complexity of constrained VC-classes
Discrete Applied Mathematics
In machine learning, maximizing the sample margin can reduce the generalization error of learning. Samples on which the target function has a large margin (γ) convey more information, since they yield more accurate hypotheses. Let X be a finite domain and let S_m denote the family of all samples S ⊆ X of fixed cardinality m. Let H be a class of hypotheses h on X. A hyperconcept h' is defined as the indicator function of the set A ⊆ S_m of all samples on which the corresponding hypothesis h has margin at least γ. An estimate of the complexity of the class H' of hyperconcepts is obtained, with explicit dependence on γ, the pseudodimension of H, and m.
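The hyperconcept construction can be made concrete with a small sketch. The code below assumes linear hypotheses h(x) = sign(w·x + b) with geometric margin as the margin notion; the names `margin` and `hyperconcept` and the toy domain are illustrative, not from the paper. It enumerates the family S_m of all size-m samples over a finite labeled domain X and evaluates h'(S) = 1 iff the hypothesis attains margin at least γ on every point of S.

```python
# Sketch of a hyperconcept h' for a linear hypothesis (w, b).
# Assumptions (not from the paper): linear hypotheses, geometric margin,
# a tiny hand-picked labeled domain X.
import itertools
import math

def margin(w, b, sample):
    """Smallest geometric margin of (w, b) over a labeled sample
    [(x, y), ...] with y in {-1, +1}."""
    norm = math.sqrt(sum(wi * wi for wi in w))
    return min(y * (sum(wi * xi for wi, xi in zip(w, x)) + b) / norm
               for x, y in sample)

def hyperconcept(w, b, gamma):
    """Indicator h' of the set A of samples on which (w, b) has
    margin at least gamma."""
    return lambda sample: 1 if margin(w, b, sample) >= gamma else 0

# Finite labeled domain X; per-point margins under w = (0, 1), b = 0
# are 1, 2, 1, 2 respectively.
X = [((0.0, 1.0), 1), ((0.0, 2.0), 1),
     ((0.0, -1.0), -1), ((0.5, -2.0), -1)]
m = 2
S_m = list(itertools.combinations(X, m))   # all samples of cardinality m

h_prime = hyperconcept(w=(0.0, 1.0), b=0.0, gamma=1.5)
values = [h_prime(S) for S in S_m]
# Exactly one of the 6 samples (the one pairing the two margin-2 points)
# has margin >= 1.5, so h' labels it 1 and all others 0.
```

Varying γ shrinks or grows the set A: at γ = 0 every correctly classified sample is in A, while large γ leaves A empty, which is the trade-off the complexity estimate for H' quantifies.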