As a kernel-based method, the performance of the least squares support vector machine (LS-SVM) depends on the selection of the kernel as well as the regularization parameter (Duan, Keerthi, & Poo, 2003). Cross-validation is effective for selecting a single kernel and the regularization parameter; however, it incurs a heavy computational cost and lacks the flexibility to handle multiple kernels. In this paper, we address multiple kernel learning for the LS-SVM by formulating it as a semidefinite program (SDP). Furthermore, we show that the regularization parameter can be optimized jointly with the kernel in a unified framework, which yields an automatic model selection procedure. Extensive experimental validations are performed and analyzed.
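For context, training an LS-SVM amounts to solving a single linear system in the dual variables, which is what makes joint kernel/parameter optimization tractable. The sketch below is a minimal NumPy illustration, not the paper's SDP formulation: the kernel weights `mu`, the kernel bandwidths, the toy data, and the value of `gamma` are all illustrative assumptions; in the paper these weights (and `gamma`) would be learned by the SDP rather than fixed by hand.

```python
import numpy as np

def lssvm_train(K, y, gamma):
    # LS-SVM dual: solve [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, dual coefficients alpha

def rbf_kernel(X, sigma):
    # Pairwise squared distances -> Gaussian (RBF) kernel matrix
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

# Toy data: two well-separated Gaussian blobs, labels in {-1, +1}
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 0.5, (20, 2)),
               rng.normal(+1.0, 0.5, (20, 2))])
y = np.concatenate([-np.ones(20), np.ones(20)])

# Convex combination of two candidate kernels; the weights mu would be
# produced by the SDP in the paper, but are fixed here for illustration.
mu = [0.6, 0.4]
K = mu[0] * rbf_kernel(X, 0.5) + mu[1] * rbf_kernel(X, 2.0)

b, alpha = lssvm_train(K, y, gamma=10.0)
preds = np.sign(K @ alpha + b)
print("training accuracy:", (preds == y).mean())
```

Because the dual problem is a linear system rather than a quadratic program, wrapping it inside an outer optimization over the kernel combination (the SDP of the paper) only requires that the combined kernel matrix remain positive semidefinite, which a convex combination of valid kernels guarantees.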