Sparse multikernel support vector regression machines trained by active learning
Expert Systems with Applications: An International Journal
In this paper, a reducing-samples strategy, rather than classical ν-support vector regression (ν-SVR), i.e., single-kernel ν-SVR, is used to select training samples for the admissible functions, thereby curtailing the computational complexity. The proposed multikernel learning algorithm, reducing samples based multikernel semiparametric support vector regression (RS-MSSVR), outperforms single-kernel support vector regression (classical ε-SVR) in regression accuracy. Meanwhile, compared with multikernel semiparametric support vector regression (MSSVR), the algorithm also enjoys lower computational complexity with comparable generalization performance. Finally, the efficacy and feasibility of RS-MSSVR are corroborated by experiments on synthetic and real-world benchmark data sets.
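To make the multikernel semiparametric model structure concrete, the sketch below fits f(x) = Σ_m K_m(x, X)α_m + B(x)β, where the parametric part B collects the admissible functions. This is an illustration only, not the paper's algorithm: it uses regularized least squares as a stand-in for the ε-insensitive SVR loss, and the RBF kernel widths, the basis [1, x], and the noisy-sinc target are all assumptions.

```python
import numpy as np

def rbf_kernel(X, Z, gamma):
    """Gaussian (RBF) kernel matrix between the rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_multikernel_semiparametric(X, y, gammas=(0.5, 5.0), lam=1e-2):
    """Fit f(x) = sum_m K_m(x, X) @ alpha_m + B(x) @ beta, with a
    parametric (semiparametric) part built from admissible functions
    [1, x]. Solved by regularized least squares as a simple stand-in
    for the epsilon-insensitive loss used in SVR training."""
    n = X.shape[0]
    Ks = [rbf_kernel(X, X, g) for g in gammas]   # one kernel per scale
    B = np.hstack([np.ones((n, 1)), X])          # admissible functions
    Phi = np.hstack(Ks + [B])                    # combined design matrix
    w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)
    return w, X, gammas

def predict(model, Xnew):
    w, Xtr, gammas = model
    Ks = [rbf_kernel(Xnew, Xtr, g) for g in gammas]
    B = np.hstack([np.ones((Xnew.shape[0], 1)), Xnew])
    return np.hstack(Ks + [B]) @ w

# Toy usage: a noisy sinc target, a common SVR benchmark function.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(80)
model = fit_multikernel_semiparametric(X, y)
yhat = predict(model, X)
```

A reducing-samples step, as described in the abstract, would shrink the rows of X used to build the kernel columns before this fit, trading a little accuracy for a smaller training problem.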