We discuss feature extraction by support vector machines (SVMs). Because the coefficient vector of the separating hyperplane is orthogonal to that hyperplane, it can serve as a projection vector. To obtain further projection vectors orthogonal to those already found, we train the SVM in the complementary space of the subspace spanned by the existing projection vectors; this is achieved by modifying the kernel function. We demonstrate the validity of this method on two-class benchmark data sets.
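The procedure described above can be sketched for the linear-kernel special case, where "modifying the kernel function" reduces to deflating the data: each sample's component along the projection vectors found so far is removed, so the next SVM is trained in the complementary subspace. The following is a minimal illustrative sketch, not the authors' implementation; it assumes scikit-learn's `SVC` and synthetic data from `make_classification` (both are this example's choices, not from the paper).

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Hypothetical two-class data standing in for the paper's benchmark sets.
X, y = make_classification(n_samples=200, n_features=5, n_informative=4,
                           n_redundant=0, random_state=0)

def svm_projection_vectors(X, y, n_vectors=2):
    """Extract successive orthogonal projection vectors with linear SVMs."""
    vectors = []
    Xd = X.copy()
    for _ in range(n_vectors):
        clf = SVC(kernel="linear").fit(Xd, y)
        w = clf.coef_.ravel()
        w = w / np.linalg.norm(w)       # unit projection vector
        vectors.append(w)
        # Deflate: remove the component along w, so the next SVM is
        # trained in the complementary space of the vectors found so far.
        Xd = Xd - np.outer(Xd @ w, w)
    return np.array(vectors)

W = svm_projection_vectors(X, y, n_vectors=2)
print(np.abs(W[0] @ W[1]))  # near zero: the vectors are orthogonal
```

The second weight vector is a linear combination of the deflated support vectors, all of which lie in the orthogonal complement of the first vector, so orthogonality holds by construction. For nonlinear kernels the same deflation is carried out implicitly on the kernel matrix rather than on the input data.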