Incorporating prior knowledge (PK) into learning methods is an effective way to improve learning performance, yet the consistency and error theories of PK-based methods, which are of great theoretical importance, remain far from well established. Focusing on PK-based kernel regression, this paper proposes a methodology for analyzing consistency and error: specific methods are first converted to a unified optimization problem and then to a unified solution expression, to which a general consistency and error analysis tool is applied. Several examples illustrate the analysis procedure.
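To make the setting concrete, the baseline kernel regression method underlying such analyses, regularized least squares (kernel ridge regression), admits a closed-form solution expression. The following is a minimal NumPy sketch, not the paper's unified formulation; the RBF kernel, parameter values, and the toy sine target are illustrative assumptions:

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - z_j||^2)
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_krr(X, y, lam=1e-3, gamma=1.0):
    # Closed-form regularized least-squares solution in the RKHS:
    # alpha = (K + lam * n * I)^{-1} y, so that f(x) = sum_i alpha_i k(x, x_i)
    n = len(X)
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def predict(X_train, alpha, X_new, gamma=1.0):
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# Toy regression problem (illustrative data, not from the paper)
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(40, 1))
y = np.sin(3 * X[:, 0])
alpha = fit_krr(X, y, lam=1e-6, gamma=10.0)
y_hat = predict(X, alpha, X, gamma=10.0)
print(float(np.max(np.abs(y_hat - y))))  # small training error for tiny lam
```

A prior-knowledge variant would add further constraints or penalty terms to this optimization problem; the paper's methodology maps such variants back to a solution expression of this unified linear-system form before analyzing consistency and error.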