In many real-life domains, unknown systems exhibit different data trends in different regions: some parts vary steeply while others vary smoothly. If we identify these systems with the conventional kernel learning algorithm, i.e., the single-kernel linear programming support vector regression, the identification results are usually poor. Hence, we exploit the nonlinear mappings induced by the kernel functions as admissible functions to construct a novel multikernel semiparametric predictor, called MSLP-SVR, to improve regression effectiveness. Experimental results on synthetic and real-world data sets corroborate the efficacy and validity of the proposed MSLP-SVR. Moreover, compared with other multikernel linear programming support vector algorithms, ours also shows advantages. In addition, although MSLP-SVR is proposed for regression, it can also be extended to classification problems.
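To make the idea concrete, the following is a minimal sketch of a *multikernel* linear programming SVR of the general kind the abstract describes: the predictor combines several kernels (here two RBF kernels with different widths, one suited to smooth regions and one to steep regions), and the expansion coefficients are found by minimizing their L1 norm plus an epsilon-insensitive slack penalty via a linear program. This is an illustrative reconstruction, not the paper's exact MSLP-SVR formulation; the kernel choices, `C`, and `eps` values are assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def rbf_kernel(X1, X2, gamma):
    """Gaussian RBF kernel matrix between row sets X1 and X2."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def multikernel_lp_svr(X, y, gammas=(0.5, 50.0), C=10.0, eps=0.01):
    """Fit f(x) = sum_k K_k(x, X) @ beta_k + b by linear programming:
    minimize ||beta||_1 + C * sum(xi) subject to |y - f(x)| <= eps + xi.
    Coefficients are split as beta = a_plus - a_minus so all LP variables
    are nonnegative; b = b_plus - b_minus handles the free bias."""
    n, m = len(X), len(gammas)
    K = np.hstack([rbf_kernel(X, X, g) for g in gammas])   # n x (m*n)
    nv = 2 * m * n + 2 + n                                 # a+, a-, b+, b-, xi
    c = np.concatenate([np.ones(2 * m * n), [0.0, 0.0], C * np.ones(n)])
    I, one = np.eye(n), np.ones((n, 1))
    A_ub = np.vstack([
        np.hstack([-K,  K, -one,  one, -I]),   # y - f(x) <= eps + xi
        np.hstack([ K, -K,  one, -one, -I]),   # f(x) - y <= eps + xi
    ])
    b_ub = np.concatenate([eps - y, eps + y])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * nv, method="highs")
    z = res.x
    beta = z[:m * n] - z[m * n:2 * m * n]
    b = z[2 * m * n] - z[2 * m * n + 1]
    def predict(Xq):
        cols = np.hstack([rbf_kernel(Xq, X, g) for g in gammas])
        return cols @ beta + b
    return predict, beta

# Toy target with a smooth region and a steep transition, as in the abstract
X = np.linspace(0, 1, 60)[:, None]
y = np.sin(2 * np.pi * X[:, 0]) + 0.5 * np.tanh(40 * (X[:, 0] - 0.7))
predict, beta = multikernel_lp_svr(X, y)
```

The L1 objective tends to make most of `beta` exactly zero, so the fitted model stays sparse; the wide-gamma kernel columns can absorb the steep transition while the narrow-gamma ones cover the smooth trend.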