Sparse least squares support vector regressors trained in the reduced empirical feature space
ICANN '07: Proceedings of the 17th International Conference on Artificial Neural Networks
Support vector regressors (SVRs) usually give sparse solutions, but as a regression problem becomes more difficult the number of support vectors increases and sparsity is lost. To solve this problem, in this paper we propose sparse support vector regressors (S-SVRs) trained in the reduced empirical feature space. First, by forward selection we select the training samples that minimize the regression error estimated by kernel least squares. Then, in the reduced empirical feature space spanned by the selected mapped training samples, we train the SVR in the dual form. Since the mapped support vectors obtained by training the S-SVR are expressed as linear combinations of the selected mapped training samples, the support vectors that form the solution are drawn from the selected training samples. By computer simulation, we compare the performance of the proposed method with that of the regular SVR and that of a sparse SVR based on Cholesky factorization.
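To make the two-stage procedure concrete, the sketch below (Python with NumPy and scikit-learn, neither of which is prescribed by the paper) performs greedy forward selection of basis samples by minimizing a kernel least-squares fitting error, maps all data into the empirical feature space spanned by the selected samples, and then trains a standard epsilon-SVR in that space. The RBF kernel, the hyperparameter values, and the use of scikit-learn's SVR as the dual solver are illustrative assumptions, not the authors' implementation.

import numpy as np
from sklearn.svm import SVR


def rbf_kernel(A, B, gamma=1.0):
    """RBF kernel matrix between the rows of A and the rows of B."""
    d2 = (np.sum(A ** 2, axis=1)[:, None]
          + np.sum(B ** 2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-gamma * d2)


def forward_select(X, y, n_select, gamma=1.0, ridge=1e-8):
    """Greedily add the sample whose inclusion most reduces the
    kernel least-squares regression error (a stand-in for the
    paper's selection criterion)."""
    selected, remaining = [], list(range(len(X)))
    for _ in range(n_select):
        best_err, best_j = np.inf, None
        for j in remaining:
            idx = selected + [j]
            K = rbf_kernel(X, X[idx], gamma)            # n x m design matrix
            G = K.T @ K + ridge * np.eye(len(idx))      # regularized normal equations
            coef = np.linalg.solve(G, K.T @ y)
            err = np.mean((y - K @ coef) ** 2)
            if err < best_err:
                best_err, best_j = err, j
        selected.append(best_j)
        remaining.remove(best_j)
    return selected


def empirical_feature_map(X, basis, gamma=1.0, tol=1e-10):
    """Coordinates of X in the empirical feature space spanned by the
    mapped basis samples: z(x) = Lambda^{-1/2} U^T k_basis(x)."""
    K_bb = rbf_kernel(basis, basis, gamma)
    lam, U = np.linalg.eigh(K_bb)
    keep = lam > tol
    P = U[:, keep] / np.sqrt(lam[keep])                 # m x r projection
    return rbf_kernel(X, basis, gamma) @ P              # n x r coordinates


# Toy usage: noisy sinc regression with 15 selected basis samples.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sinc(X).ravel() + 0.05 * rng.standard_normal(200)

sel = forward_select(X, y, n_select=15, gamma=0.5)
Z = empirical_feature_map(X, X[sel], gamma=0.5)
model = SVR(kernel="linear", C=10.0, epsilon=0.05).fit(Z, y)  # dual training in the reduced space
print("basis samples:", len(sel), " training MSE:", np.mean((model.predict(Z) - y) ** 2))

In this sketch, sparsity is controlled directly by n_select: the trained model only ever refers to the selected basis samples, regardless of how many support vectors the dual solver returns in the reduced space.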