The convergence rate of a regularized ranking algorithm
Journal of Approximation Theory
In this paper, we study the consistency of regularized least-squares regression in a general reproducing kernel Hilbert space (RKHS). We characterize the compactness of the inclusion map from an RKHS into the space of continuous functions and show that capacity-based analysis via uniform covering numbers may fail in this general setting. We then prove consistency and derive learning rates by means of integral operator techniques; to this end, we study the properties of the associated integral operator. The analysis reveals that the essence of this approach is the isomorphism induced by the square root of that operator.
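The estimator the abstract analyzes, regularized least-squares regression in an RKHS, can be sketched as kernel ridge regression: given samples (x_i, y_i), minimize the empirical squared loss plus lambda times the squared RKHS norm, which reduces to a linear system in the Gram matrix. Below is a minimal, self-contained sketch; the Gaussian kernel and the parameter values (`lam`, `sigma`) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    """Gram matrix with entries k(x, z) = exp(-||x - z||^2 / (2 sigma^2))."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_regularized_ls(X, y, lam=1e-3, sigma=0.3):
    """Regularized least squares in the RKHS of the chosen kernel.

    The representer theorem gives f(x) = sum_i alpha_i k(x_i, x), where
    alpha solves (K + lam * n * I) alpha = y (illustrative scaling of lam).
    """
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    alpha = np.linalg.solve(K + lam * n * np.eye(n), y)
    return lambda Xnew: gaussian_kernel(Xnew, X, sigma) @ alpha

# Toy example: recover a smooth target from noisy samples.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(50, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(50)
f = fit_regularized_ls(X, y)
Xtest = np.linspace(-1.0, 1.0, 200)[:, None]
sup_err = np.max(np.abs(f(Xtest) - np.sin(3.0 * Xtest[:, 0])))
```

The learning rates discussed in the abstract concern how fast the error of such an estimator decays as the sample size grows, under assumptions on the kernel's integral operator rather than on uniform covering numbers.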