Many different capacity measures for learning machines are known, such as the Vapnik-Chervonenkis (VC) dimension, covering numbers, and the fat-shattering dimension. In this paper we present experimental estimates of sample complexity for rather simple learning machines that are linear in their parameters. We show that sample complexity can differ substantially even between learning machines with the same VC dimension. Moreover, independently of a machine's capacity, the distribution of the data is also significant. The experimental results are compared with known theoretical sample-complexity results and generalization bounds.
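To make the kind of procedure the abstract describes concrete, here is a minimal sketch of empirical sample-complexity estimation for a machine linear in its parameters. This is an illustrative assumption, not the paper's actual experimental setup: the perceptron learner, the Gaussian data model, and the constants DIM, EPSILON, and TRIALS are all hypothetical choices made for the example.

```python
# A minimal sketch (not the paper's code) of empirically estimating sample
# complexity: grow the training set until the mean test error of a linear
# learner drops below a target epsilon. All settings here are assumptions.
import numpy as np

rng = np.random.default_rng(0)
DIM = 10        # input dimension; a homogeneous linear separator in R^DIM
                # has VC dimension DIM (assuming no bias term)
EPSILON = 0.1   # target generalization error (hypothetical)
TRIALS = 20     # repetitions per training-set size (hypothetical)

def sample_data(n, spread=1.0):
    """Draw n points from an isotropic Gaussian, labeled by a fixed
    ground-truth linear separator; `spread` controls the distribution."""
    w_true = np.ones(DIM) / np.sqrt(DIM)
    X = rng.normal(scale=spread, size=(n, DIM))
    y = np.sign(X @ w_true)
    return X, y

def train_perceptron(X, y, epochs=50):
    """Plain perceptron: one simple learning machine linear in parameters."""
    w = np.zeros(DIM)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (w @ xi) <= 0:   # misclassified -> update
                w += yi * xi
    return w

def test_error(w, n_test=5000):
    """Estimate generalization error on a fresh sample."""
    X, y = sample_data(n_test)
    return np.mean(np.sign(X @ w) != y)

# The smallest m whose mean test error falls below EPSILON serves as the
# empirical sample-complexity estimate.
for m in [10, 20, 40, 80, 160, 320, 640]:
    errs = [test_error(train_perceptron(*sample_data(m))) for _ in range(TRIALS)]
    print(f"m={m:4d}  mean test error={np.mean(errs):.3f}")
    if np.mean(errs) < EPSILON:
        print(f"empirical sample complexity ~ {m} for epsilon={EPSILON}")
        break
```

Varying `spread` (or replacing the Gaussian with another distribution) while holding DIM, and hence the VC dimension, fixed illustrates the abstract's point that the data distribution affects sample complexity independently of the machine's capacity.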