A novel training algorithm for nonlinear discriminants for classification and regression in Reproducing Kernel Hilbert Spaces (RKHSs) is presented. It is shown how the overdetermined linear least-squares problem in the corresponding RKHS may be solved within a greedy forward selection scheme by updating the pseudoinverse in an order-recursive way. The described construction of the pseudoinverse gives rise to an update of the orthogonal decomposition of the reduced Gram matrix in linear time. Regularization in the spirit of ridge regression can then easily be applied in the orthogonal space. Various experiments for both classification and regression demonstrate the competitiveness of the proposed method.
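The following is a minimal sketch, in Python/NumPy, of the kind of greedy forward-selection scheme the abstract describes: kernel columns are added one at a time, each new column is orthogonalised against the previously selected ones (an order-recursive, QR-style update), and a ridge-type penalty is applied when the weights of the selected columns are recovered. The Gaussian RBF kernel, the correlation-with-residual selection criterion, the final reduced-problem ridge solve, and all names (`greedy_kernel_ridge`, `n_basis`, `ridge`, `gamma`) are illustrative assumptions; the paper's algorithm instead maintains the pseudoinverse incrementally and applies the regularization directly in the orthogonal space.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Y (assumed kernel choice)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def greedy_kernel_ridge(X, y, n_basis=20, ridge=1e-3, gamma=1.0):
    """Illustrative greedy forward selection of kernel columns for regularised least squares.

    Each step adds the kernel column most correlated with the current residual,
    orthogonalises it against the columns already chosen (order-recursive,
    QR-style update), and deflates the residual. A ridge penalty is applied
    when the weights of the selected columns are finally recovered.
    """
    K = rbf_kernel(X, X, gamma)              # full Gram matrix, n x n
    n = K.shape[0]
    selected = []
    Q = np.empty((n, 0))                     # orthonormal basis of selected columns
    residual = y.astype(float)

    for _ in range(n_basis):
        scores = np.abs(K.T @ residual)      # correlation of each column with the residual
        scores[selected] = -np.inf           # never reselect a column
        j = int(np.argmax(scores))
        q = K[:, j] - Q @ (Q.T @ K[:, j])    # orthogonalise the new column
        norm = np.linalg.norm(q)
        if norm < 1e-10:                     # numerically dependent column: stop
            break
        q /= norm
        Q = np.column_stack([Q, q])
        selected.append(j)
        residual = residual - q * (q @ residual)   # deflate residual in the new direction

    # Recover weights for the selected kernel columns with a ridge penalty
    # (assumed variant: the paper regularises in the orthogonal space instead).
    Ks = K[:, selected]
    A = Ks.T @ Ks + ridge * np.eye(len(selected))
    w = np.linalg.solve(A, Ks.T @ y)
    return selected, w

# Usage: the fitted function is f(x) = sum_i w_i * k(x, x_{selected[i]}).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
idx, w = greedy_kernel_ridge(X, y, n_basis=15, ridge=1e-2)
y_hat = rbf_kernel(X, X[idx]) @ w
print("selected", len(idx), "basis centres; train MSE =", float(np.mean((y - y_hat) ** 2)))
```

Because only the greedily selected kernel centres carry weights, the resulting discriminant is sparse, and maintaining the orthogonal decomposition incrementally is what keeps the per-step cost linear in the number of training points.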