I present many new results on sample complexity bounds (bounds on the future error rate of arbitrary learning algorithms). Of theoretical interest are qualitative and quantitative improvements in sample complexity bounds, as well as techniques and criteria for judging their tightness. On the practical side, I show quantitative results (with true error rate bounds sometimes below 0.01) for decision trees and neural networks, applying these sample complexity bounds to real-world problems. I also present a technique for combining sample complexity bounds with more traditional holdout techniques. Together, the theoretical and practical results of this thesis provide a well-founded, practical method for evaluating learning algorithm performance based on both training and test set performance. Code for calculating these bounds is provided.
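As an illustrative sketch only (not the thesis's own code), the kind of holdout-based true error bound mentioned above can be computed by binomial tail inversion: given k errors observed on n i.i.d. holdout examples, find the smallest true error rate p such that seeing at most k errors would still have probability at least delta. The function name and bisection tolerance below are my own choices.

```python
import math

def binom_tail_inversion(k: int, n: int, delta: float) -> float:
    """Upper bound on the true error rate, holding with probability
    at least 1 - delta, after observing k errors on n i.i.d. holdout
    examples. Returns the smallest p with P[Bin(n, p) <= k] <= delta."""
    def cdf(p: float) -> float:
        # P[Bin(n, p) <= k]: probability of at most k errors at rate p.
        return sum(math.comb(n, i) * p**i * (1 - p)**(n - i)
                   for i in range(k + 1))

    # cdf(p) is decreasing in p, so bisect for the crossing point.
    lo, hi = k / n, 1.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if cdf(mid) > delta:
            lo = mid  # bound still too optimistic; move up
        else:
            hi = mid  # bound already holds; tighten
    return hi

# Example: a perfect holdout score (0 errors on 100 examples) at
# confidence 95% still only certifies a true error rate below ~3%.
bound = binom_tail_inversion(k=0, n=100, delta=0.05)
print(f"true error <= {bound:.4f} with probability >= 0.95")
```

Note how much tighter this exact binomial bound is than the Hoeffding bound sqrt(ln(1/delta) / (2n)) ≈ 0.12 for the same n and delta, which is part of why exact tail inversion is attractive in practice.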