Machine Learning
SIAM Review
Support Vector Machines for 3D Object Recognition
IEEE Transactions on Pattern Analysis and Machine Intelligence
Fast training of support vector machines using sequential minimal optimization
Advances in kernel methods
An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods
Learning and Soft Computing: Support Vector Machines, Neural Networks, and Fuzzy Logic Models
Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond
On the Nesterov--Todd Direction in Semidefinite Programming
SIAM Journal on Optimization
Choosing Multiple Parameters for Support Vector Machines
Machine Learning
Support Vector Machines for Texture Classification
IEEE Transactions on Pattern Analysis and Machine Intelligence
MARK: a boosting algorithm for heterogeneous kernel models
Proceedings of the Eighth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
Face Recognition by Support Vector Machines
Proceedings of the Fourth IEEE International Conference on Automatic Face and Gesture Recognition (FG '00)
Kernel Methods for Pattern Analysis
Learning the Kernel Matrix with Semidefinite Programming
The Journal of Machine Learning Research
Multiple kernel learning, conic duality, and the SMO algorithm
Proceedings of the Twenty-First International Conference on Machine Learning (ICML '04)
Learning the Kernel with Hyperkernels
The Journal of Machine Learning Research
Large Scale Multiple Kernel Learning
The Journal of Machine Learning Research
More efficiency in multiple kernel learning
Proceedings of the 24th International Conference on Machine Learning
MultiK-MHKS: A Novel Multiple Kernel Learning Algorithm
IEEE Transactions on Pattern Analysis and Machine Intelligence
Localized multiple kernel learning
Proceedings of the 25th International Conference on Machine Learning
IEEE Transactions on Information Technology in Biomedicine
Evolutionary tuning of multiple SVM parameters
Neurocomputing
Data Mining with Computational Intelligence
Employing multiple-kernel support vector machines for counterfeit banknote recognition
Applied Soft Computing
A multiple-kernel support vector regression approach for stock market price forecasting
Expert Systems with Applications: An International Journal
Support Vector Machine Training for Improved Hidden Markov Modeling
IEEE Transactions on Signal Processing
Support vector machines for spam categorization
IEEE Transactions on Neural Networks
The evidence framework applied to support vector machines
IEEE Transactions on Neural Networks
Efficient hyperkernel learning using second-order cone programming
IEEE Transactions on Neural Networks
Support vector machines (SVMs) have been widely applied to classification problems. However, successful application of an SVM depends heavily on choosing the right type of kernel function and suitable hyperparameter settings for it. Recently, multiple-kernel learning (MKL) algorithms have been developed to address these issues by combining several kernels, with the weight associated with each kernel obtained through learning. Lanckriet et al. proposed deriving the weights by transforming the learning problem into a semidefinite programming (SDP) problem in a transduction setting. However, this method demands large amounts of time and space. In this paper, we reformulate the SDP problem in an induction setting and, based on the discussion in the Lanckriet et al. paper, incorporate two strategies to reduce the search complexity of the learning process. The primal and dual forms of the SDP are derived, and the computational complexity is analyzed. Experimental results on synthetic and benchmark datasets show that the proposed method performs multiple-kernel learning efficiently.
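The core object in MKL is a convex combination of base kernel matrices, K = Σ_i μ_i K_i with μ_i ≥ 0, which is guaranteed to remain a valid (positive semidefinite) kernel. The sketch below, a minimal NumPy illustration with hand-picked toy data and fixed weights (in actual MKL the weights μ are learned, e.g., via the SDP formulation of Lanckriet et al.), shows how two base kernels are combined and checks that the result is PSD:

```python
import numpy as np

# Hypothetical toy data: six 2D points (illustrative only).
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0],
              [3.0, 3.0], [4.0, 3.0], [3.0, 4.0]])

def linear_kernel(X):
    # K1[i, j] = <x_i, x_j>
    return X @ X.T

def rbf_kernel(X, gamma=0.5):
    # K2[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * d2)

# Fixed, hand-chosen weights for illustration; MKL would learn these.
mu = np.array([0.3, 0.7])
K = mu[0] * linear_kernel(X) + mu[1] * rbf_kernel(X)

# A nonnegative combination of PSD matrices is PSD, so K is a valid
# kernel matrix; verify numerically via its smallest eigenvalue.
eigvals = np.linalg.eigvalsh(K)
print(eigvals.min() >= -1e-9)  # True: K is (numerically) PSD
```

The combined K can be handed to any kernel machine (e.g., an SVM solver) in place of a single kernel; the learning problem studied in the paper is precisely how to pick the weights μ so that the resulting classifier generalizes well.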