The nature of statistical learning theory.
Least Squares Support Vector Machine Classifiers. Neural Processing Letters.
Reduced Rank Kernel Ridge Regression. Neural Processing Letters.
Learning the Kernel Matrix with Semidefinite Programming. The Journal of Machine Learning Research.
Column-generation boosting methods for mixture of kernels. Proceedings of the Tenth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining.
Learning the Kernel with Hyperkernels. The Journal of Machine Learning Research.
Text classification: A least square support vector machine approach. Applied Soft Computing.
Breast cancer diagnosis using least square support vector machine. Digital Signal Processing.
Recursive reduced least squares support vector regression. Pattern Recognition.
Optimized fixed-size kernel models for large data sets. Computational Statistics & Data Analysis.
Feature selection in the Laplacian support vector machine. Computational Statistics & Data Analysis.
A modified support vector machine and its application to image segmentation. Image and Vision Computing.
Multikernel semiparametric linear programming support vector regression. Expert Systems with Applications.
Comparing support vector machines with Gaussian kernels to radial basis function classifiers. IEEE Transactions on Signal Processing.
Orthogonal least squares learning algorithm for radial basis function networks. IEEE Transactions on Neural Networks.
Pruning error minimization in least squares support vector machines. IEEE Transactions on Neural Networks.
SMO-based pruning methods for sparse least squares support vector machines. IEEE Transactions on Neural Networks.
Fast Sparse Approximation for Least Squares Support Vector Machine. IEEE Transactions on Neural Networks.
The optimal number of kernels and the placement of kernel centers play a significant role in determining the approximation power of nearly all kernel methods. However, choosing optimal kernels is usually formulated as a global optimization problem, which is difficult to solve. Recently, an improved algorithm called recursive reduced least squares support vector regression (IRR-LSSVR) was proposed for building a global nonparametric offline model, and it shows a clear advantage over comparable methods in selecting representative support vectors. Inspired by IRR-LSSVR, this paper proposes a new online adaptive parametric kernel method, Weights Varying Least Squares Support Vector Regression (WV-LSSVR), which uses the same kernel type and the same centers as IRR-LSSVR. Furthermore, inspired by multikernel semiparametric support vector regression, the effect of extending the kernel is investigated within a recursive regression framework, and a recursive kernel method, Gaussian Process Kernel Least Squares Support Vector Regression (GPK-LSSVR), is proposed using a compound kernel of the type recommended for Gaussian process regression. Numerical experiments on benchmark data sets confirm the validity and effectiveness of the proposed algorithms. WV-LSSVR achieves higher approximation accuracy than a recursive parametric kernel method whose centers are computed by k-means clustering. The extended recursive kernel method (i.e., GPK-LSSVR) shows no advantage in global approximation accuracy when the test data set is evaluated without real-time updates, but it does improve modeling accuracy when real-time identification is involved.
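To make the underlying machinery concrete, below is a minimal sketch of plain batch LS-SVR (the standard dual linear system of Suykens and Vandewalle, cited above) with a Gaussian kernel, plus a hypothetical compound kernel (Gaussian plus a linear term) in the spirit of the kernel combinations recommended for Gaussian process regression. It does not reproduce the paper's recursive, weights-varying updates or the exact GPK-LSSVR formulation; all function names and parameter values (sigma, gamma, c) are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(X1, X2, sigma=1.0):
    # Pairwise squared distances -> Gaussian (RBF) kernel matrix.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def compound_kernel(X1, X2, sigma=1.0, c=1.0):
    # Hypothetical compound kernel (assumption, not the paper's exact GPK):
    # a Gaussian term plus a weighted linear term.
    return gaussian_kernel(X1, X2, sigma) + c * (X1 @ X2.T)

def lssvr_fit(X, y, gamma=10.0, kernel=gaussian_kernel):
    # Standard LS-SVR dual system:
    # [ 0   1^T         ] [ b     ]   [ 0 ]
    # [ 1   K + I/gamma ] [ alpha ] = [ y ]
    n = X.shape[0]
    K = kernel(X, X)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, dual weights alpha

def lssvr_predict(X_train, b, alpha, X_new, kernel=gaussian_kernel):
    # f(x) = sum_i alpha_i k(x, x_i) + b
    return kernel(X_new, X_train) @ alpha + b

# Toy usage: fit a noisy sine and evaluate on held-out points.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(60, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(60)
b, alpha = lssvr_fit(X, y, gamma=50.0)
X_test = np.linspace(-3, 3, 5)[:, None]
print(lssvr_predict(X, b, alpha, X_test))
```

The dual system makes the role of the kernel matrix explicit: swapping gaussian_kernel for compound_kernel changes only how similarity between samples is measured, which is precisely the knob that the kernel extension in GPK-LSSVR turns.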