The infinite polynomial kernel for support vector machine
ADMA'05 Proceedings of the First international conference on Advanced Data Mining and Applications
Dot product kernels are an important class of kernels in the theory of support vector machines. This paper develops a method for constructing the mapping that takes the original data set into the high-dimensional feature space on which the inner product is defined by a dot product kernel. The method can also be applied to Gaussian kernels. Via this mapping, the structure of features in the feature space is easy to observe, and the linear separability of data sets in the feature space is studied. We show that any two finite sets of data with empty overlap in the original space become linearly separable in an infinite-dimensional feature space, and we derive a necessary and sufficient condition for two infinite sets of data in the original space to be linearly separable in the feature space; this condition can be used to examine the existence and uniqueness of a hyperplane that separates all possible inputs correctly.
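The central idea, that a dot product kernel equals an inner product under an explicit feature map, can be illustrated with a minimal sketch. This is not the paper's construction; it uses the simplest case, the homogeneous degree-2 polynomial kernel k(x, y) = (x·y)^2, and all function names here are mine.

```python
import itertools
import math

def poly2_feature_map(x):
    # Explicit feature map for the homogeneous polynomial kernel
    # k(x, y) = (x . y)^2: one coordinate per ordered index pair (i, j),
    # phi(x)_{ij} = x_i * x_j, so <phi(x), phi(y)> = (x . y)^2.
    return [xi * xj for xi, xj in itertools.product(x, repeat=2)]

def dot(u, v):
    # Plain Euclidean inner product.
    return sum(a * b for a, b in zip(u, v))

x = [1.0, 2.0]
y = [3.0, -1.0]

kernel_value = dot(x, y) ** 2                               # kernel in input space
feature_value = dot(poly2_feature_map(x), poly2_feature_map(y))  # inner product in feature space
assert math.isclose(kernel_value, feature_value)
```

For an infinite-degree dot product kernel (or a Gaussian kernel), the analogous feature map has infinitely many coordinates, which is why separability questions are posed in an infinite-dimensional feature space.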