Based on m randomly drawn vectors in a separable Hilbert space, we investigate the consistency of the regularized regression learning algorithm using Rademacher average techniques. We further apply a random projection technique to speed up the regression learning algorithm, and we establish the learning rates of the regularized regression algorithm with random projection. The theoretical analysis shows that it is possible to learn directly in the projected domain. Our results reflect a tradeoff between accuracy and computational complexity when one applies the regularized least squares regression algorithm after randomly projecting the data to a finite-dimensional space.
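As a concrete illustration only, and not the authors' algorithm statement, the following minimal Python sketch shows the scheme the abstract describes: project the data to a k-dimensional space with a Gaussian random matrix, then solve regularized least squares (ridge regression) directly in the projected domain. The data are assumed to live in R^d, and the function names `random_projection_ridge` and `predict`, the projection scaling, and all parameter values are hypothetical choices for the example.

```python
# Minimal sketch: regularized least squares regression after Gaussian
# random projection. Assumptions (not from the paper): data in R^d,
# Johnson-Lindenstrauss-style scaling 1/sqrt(k), ridge as the
# regularized least squares step.
import numpy as np


def random_projection_ridge(X, y, k, lam, seed=None):
    """Fit ridge regression in a k-dimensional randomly projected domain.

    X    : (m, d) array of m sample vectors
    y    : (m,) array of responses
    k    : target projection dimension, k << d
    lam  : regularization parameter lambda > 0
    seed : optional seed for the random projection
    """
    rng = np.random.default_rng(seed)
    m, d = X.shape
    # Gaussian random projection, scaled so squared norms are
    # preserved in expectation.
    R = rng.standard_normal((d, k)) / np.sqrt(k)
    Z = X @ R  # learn directly in the projected domain
    # Regularized least squares solution in the projected domain:
    # w = (Z^T Z + m * lam * I)^{-1} Z^T y
    w = np.linalg.solve(Z.T @ Z + m * lam * np.eye(k), Z.T @ y)
    return R, w


def predict(X_new, R, w):
    """Predict by projecting new data with the same matrix R."""
    return (X_new @ R) @ w


if __name__ == "__main__":
    # Synthetic example: increasing k trades computational cost for
    # accuracy, mirroring the tradeoff described in the abstract.
    rng = np.random.default_rng(0)
    m, d = 500, 2000
    X = rng.standard_normal((m, d))
    w_true = rng.standard_normal(d)
    y = X @ w_true + 0.1 * rng.standard_normal(m)
    R, w = random_projection_ridge(X, y, k=100, lam=1e-2, seed=1)
    print("training MSE:", np.mean((predict(X, R, w) - y) ** 2))
```

Solving the k-by-k system in the projected domain costs O(mk^2 + k^3) rather than the O(md^2 + d^3) of working in the original space, which is where the computational savings in the accuracy/complexity tradeoff come from.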