In this paper, we propose a least squares regularized regression algorithm with an l1-regularizer in a sum space of base hypothesis spaces. This sum space contains more functions than any single base hypothesis space and therefore has stronger approximation capability. We establish an excess error bound for this algorithm under assumptions on the kernels, the input space, the marginal distribution and the regression function. For the error analysis, the excess error is decomposed into the sample error, hypothesis error and regularization error, which are estimated separately. From the excess error bound, convergence and a learning rate can be derived by choosing a suitable value of the regularization parameter. The utility of the method is illustrated on two simulated data sets and one real-life data set.
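As a rough illustration of the setting, the following is a minimal numpy-only sketch of l1-regularized least squares over a sum of kernel hypothesis spaces. It is not the paper's algorithm or analysis: the Gaussian base kernels, the candidate widths, the ISTA (proximal gradient) solver, and all function names are assumptions made for this example. Each base space contributes one block of coefficients, and the l1 penalty acts on the stacked coefficient vector.

```python
import numpy as np

def gaussian_kernel(X1, X2, width):
    """Gaussian (RBF) kernel matrix between two sets of points (assumed base kernel)."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_l1_sum_space(X, y, widths, lam, n_iter=3000):
    """ISTA sketch for min_a (1/n)||y - K a||^2 + lam * ||a||_1,
    where K = [K_1 ... K_m] stacks the Gram matrices of the base spaces."""
    n = X.shape[0]
    K = np.hstack([gaussian_kernel(X, X, w) for w in widths])
    # Step size <= 1/L, where L = (2/n) * sigma_max(K)^2 bounds the gradient's Lipschitz constant
    step = 0.5 * n / np.linalg.norm(K, 2) ** 2
    alpha = np.zeros(K.shape[1])
    for _ in range(n_iter):
        grad = (2.0 / n) * K.T @ (K @ alpha - y)
        z = alpha - step * grad
        # Soft-thresholding: proximal step for the l1 penalty; zeroes out small coefficients
        alpha = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return alpha

def predict(alpha, X_train, X_new, widths):
    """Evaluate the learned sum-space function at new points."""
    K_new = np.hstack([gaussian_kernel(X_new, X_train, w) for w in widths])
    return K_new @ alpha
```

In this sketch, the regularization parameter `lam` plays the role highlighted in the abstract: it balances fit against the l1 penalty, and its choice governs which base spaces end up contributing nonzero coefficients.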