Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond
Kernel-based methods such as the support vector machine (SVM) achieve high classification performance. However, classification becomes time-consuming as the number of support vectors (SVs) defining the classifier grows. In this paper, we propose a method that reduces the computational cost of classification by kernel-based methods while retaining their high performance. Using linear algebra on the kernel Gram matrix of the SVs, the method efficiently prunes, at low computational cost, redundant SVs that are unnecessary for constructing the classifier. The pruning is guided by an evaluation of the performance of the classifier formed from the reduced SV set. In SVM classification experiments on various datasets, we demonstrate the feasibility of the evaluation criterion and the effectiveness of the proposed method.