Three general methods for obtaining exact bounds on the probability of overfitting are proposed within statistical learning theory: a method of generating and destroying sets, a recurrent method, and a blockwise method. Their application is illustrated on six model sets of predictors: a pair of predictors, a layer of a Boolean cube, an interval of a Boolean cube, a monotonic chain, a unimodal chain, and a unit neighborhood of the best predictor. For the interval and the unimodal chain, numerical experiments are presented that demonstrate the effects of splitting and similarity on the probability of overfitting.
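To make the quantity being bounded concrete, here is a small illustrative Python sketch (not the authors' code, and the tie-breaking rule is an assumption): in the combinatorial setting, each predictor is identified with its binary error vector on a fixed full sample, learning is empirical risk minimization, and the probability of overfitting is taken over all uniform splits of the sample into a training part and a test part. For tiny model sets such as a pair of predictors, it can be computed exactly by brute-force enumeration of the splits.

```python
from itertools import combinations

def prob_overfitting(error_vectors, ell, eps):
    """Exact probability of overfitting over all uniform train/test splits.

    error_vectors: binary tuples, one per predictor; entry j is 1 iff the
                   predictor errs on object j of the full sample.
    ell:           training-set size (the rest of the sample is the test set).
    eps:           overfitting threshold.
    Ties in training error are broken by list order (an assumption made here).
    """
    L = len(error_vectors[0])
    k = L - ell
    bad = total = 0
    for train in combinations(range(L), ell):
        train_set = set(train)
        test = [j for j in range(L) if j not in train_set]
        # Empirical risk minimization: predictor with fewest training errors.
        best = min(error_vectors, key=lambda e: sum(e[j] for j in train))
        nu_train = sum(best[j] for j in train) / ell
        nu_test = sum(best[j] for j in test) / k
        total += 1
        if nu_test - nu_train >= eps:
            bad += 1
    return bad / total

# Example: a pair of predictors on a sample of 6 objects, half used for training.
pair = [(1, 1, 0, 0, 0, 0), (0, 0, 1, 1, 0, 0)]
print(prob_overfitting(pair, ell=3, eps=0.1))
```

Such exhaustive enumeration scales as the binomial coefficient C(L, ell), so it is feasible only for model sets this small; the methods in the paper derive closed-form or recurrent expressions instead.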