We develop novel generalization bounds for the kernel learning problem. First, we show that the generalization analysis of kernel learning reduces to investigating the suprema of a Rademacher chaos process of order two over the candidate kernels, a quantity we refer to as the Rademacher chaos complexity. Next, we show how to estimate the empirical Rademacher chaos complexity using well-established metric entropy integrals and the pseudo-dimension of the set of candidate kernels. Our methodology relies mainly on the theory of U-processes and entropy integrals. Finally, we establish satisfactory excess generalization bounds and misclassification error rates for learning Gaussian kernels and general radial basis kernels.
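For concreteness, the central quantity admits a short explicit definition. The following is a standard U-process formulation of the empirical Rademacher chaos complexity of order two over a class $\mathcal{K}$ of candidate kernels; normalization conventions (such as the $1/n$ factor) vary across the literature:

$$\hat{U}_n(\mathcal{K}) = \mathbb{E}_{\varepsilon}\left[\, \sup_{k \in \mathcal{K}} \left| \frac{1}{n} \sum_{1 \le i < j \le n} \varepsilon_i \varepsilon_j \, k(x_i, x_j) \right| \,\middle|\, x_1, \ldots, x_n \right],$$

where $x_1, \ldots, x_n$ are the training inputs and $\varepsilon_1, \ldots, \varepsilon_n$ are i.i.d. Rademacher variables taking the values $\pm 1$ with equal probability. The second-order sum over pairs $i < j$ is what distinguishes a chaos process from the ordinary first-order Rademacher average.

The sketch below illustrates how such a quantity could be estimated numerically, averaging over random Rademacher draws for a grid of Gaussian kernel bandwidths. This is a hypothetical Monte Carlo illustration under the normalization above, not the metric-entropy-integral technique developed in the paper:

```python
import numpy as np

def gaussian_kernel(sigma):
    """Return a map from a data matrix X (n x d) to its Gram matrix under
    the Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 * sigma^2))."""
    def gram(X):
        sq = np.sum(X ** 2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
        return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma ** 2))
    return gram

def empirical_chaos_complexity(X, kernels, n_mc=500, seed=0):
    """Monte Carlo estimate of the order-two empirical Rademacher chaos
    complexity: the average over Rademacher draws eps of the maximum, over
    the kernel class, of |(1/n) * sum_{i<j} eps_i eps_j k(x_i, x_j)|.
    Uses the identity eps' K eps = 2 * sum_{i<j} eps_i eps_j K_ij + trace(K),
    which holds because eps_i^2 = 1."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    grams = [k(X) for k in kernels]          # one Gram matrix per candidate
    traces = [np.trace(G) for G in grams]
    total = 0.0
    for _ in range(n_mc):
        eps = rng.choice([-1.0, 1.0], size=n)
        total += max(abs((eps @ G @ eps - t) / 2.0) / n
                     for G, t in zip(grams, traces))
    return total / n_mc

# Illustrative candidate class: Gaussian kernels over a bandwidth grid.
X = np.random.default_rng(1).normal(size=(40, 3))
kernels = [gaussian_kernel(s) for s in (0.5, 1.0, 2.0, 4.0)]
print(empirical_chaos_complexity(X, kernels))
```

Richer candidate classes (for example, Gaussian kernels with flexible covariances, as treated in the paper) would simply enlarge the `kernels` list; the paper's bounds control the same supremum analytically via entropy integrals and the pseudo-dimension of the class rather than by sampling.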