This paper studies universal learning with free multivariate splines of order 1, where "universal" means that the learning algorithm makes no a priori assumption about the regularity of the target function. We characterize the complexity of the space of free multivariate splines through its Rademacher complexity, and use this characterization to construct a penalized empirical risk that serves as an estimate of the expected risk of each candidate model. Our Rademacher complexity bounds are tight up to a logarithmic factor. We show that the prediction rule minimizing the penalized empirical risk achieves a favorable balance between approximation error and estimation error. Finally, by applying techniques from approximation theory to bound the approximation error, we derive bounds on the generalization error in terms of the sample size for a large class of loss functions.
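The general recipe behind the abstract, estimating a class's Rademacher complexity and adding it as a penalty to the empirical risk before minimizing, can be illustrated with a small sketch. Everything below is a hypothetical toy example, not the paper's construction: the candidate classes are arbitrary finite sets of prediction vectors rather than free multivariate splines, the penalty is a Monte Carlo estimate of the empirical Rademacher complexity, and the loss is squared loss chosen purely for illustration.

```python
import random

def empirical_rademacher(preds, n_draws=500, seed=0):
    # Monte Carlo estimate of the empirical Rademacher complexity of a
    # finite class: the expected supremum, over the class, of the mean
    # correlation between each model's predictions and random +/-1 signs.
    # preds[j][i] is the prediction of candidate model j on sample point i.
    rng = random.Random(seed)
    n = len(preds[0])
    total = 0.0
    for _ in range(n_draws):
        sigma = [rng.choice((-1.0, 1.0)) for _ in range(n)]
        total += max(sum(s * p for s, p in zip(sigma, f)) for f in preds) / n
    return total / n_draws

def select_model(pred_classes, y):
    # Penalized empirical risk minimization over a list of candidate
    # classes: pick the model minimizing empirical (squared) risk plus
    # the Rademacher penalty of the class it belongs to.
    n = len(y)
    best = None
    for preds in pred_classes:
        penalty = empirical_rademacher(preds)
        for f in preds:
            risk = sum((p - t) ** 2 for p, t in zip(f, y)) / n
            score = risk + penalty
            if best is None or score < best[0]:
                best = (score, f, penalty)
    return best
```

Because the penalty grows with the richness of each class, a richer class must reduce the empirical risk by more than its added complexity to be selected, which is the approximation/estimation trade-off described above.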