A limited memory algorithm for bound constrained optimization
SIAM Journal on Scientific Computing
An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods
Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond
A Tutorial on Support Vector Machines for Pattern Recognition
Data Mining and Knowledge Discovery
Clustering validity checking methods: part II
ACM SIGMOD Record
Learning the Kernel Matrix with Semidefinite Programming
The Journal of Machine Learning Research
Working Set Selection Using Second Order Information for Training Support Vector Machines
The Journal of Machine Learning Research
An efficient kernel matrix evaluation measure
Pattern Recognition
Gaussian kernel optimization for pattern classification
Pattern Recognition
LIBSVM: A library for support vector machines
ACM Transactions on Intelligent Systems and Technology (TIST)
Optimizing the kernel in the empirical feature space
IEEE Transactions on Neural Networks
The main advantage of kernel methods stems from the implicit transformation of patterns into a high-dimensional feature space; consequently, the choice of kernel function and the proper setting of its parameters are of crucial importance. Learning a kernel from the data requires evaluation measures to assess kernel quality. In this paper, current state-of-the-art kernel evaluation measures are examined and their application to kernel optimization is verified, revealing the limitations of these methods. Alternative evaluation measures are then proposed that strive to overcome these disadvantages. Experimental results demonstrate that an optimization process leveraging the introduced measures yields kernels whose corresponding classifiers achieve significantly lower error rates.
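As an illustration of the kind of kernel evaluation measure the abstract refers to, the sketch below computes kernel-target alignment, a well-known measure of agreement between a kernel matrix and binary class labels. This is only a hedged example of one such measure; the paper's own proposed measures are not reproduced here, and the Gaussian kernel, the `gamma` parameter, and the function names are assumptions for the sketch.

```python
import numpy as np

def gaussian_kernel_matrix(X, gamma=1.0):
    """Pairwise Gaussian (RBF) kernel: K[i, j] = exp(-gamma * ||x_i - x_j||^2).
    The kernel choice and gamma are illustrative assumptions."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * np.maximum(d2, 0.0))

def kernel_target_alignment(K, y):
    """Alignment <K, yy^T>_F / (||K||_F * ||yy^T||_F) for labels y in {-1, +1}.
    Values closer to 1 indicate a kernel better matched to the labels."""
    Y = np.outer(y, y)
    return float(np.sum(K * Y) / (np.linalg.norm(K) * np.linalg.norm(Y)))

# Toy usage: two well-separated clusters give high alignment.
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
y = np.array([1, 1, -1, -1])
alignment = kernel_target_alignment(gaussian_kernel_matrix(X, gamma=1.0), y)
```

A measure like this can serve as the objective in a kernel-parameter search (e.g., maximizing alignment over `gamma`), which is the optimization setting the abstract describes.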