Although polynomials have proven to be useful tools for tailoring generic kernels to specific applications, little has been known about general rules for tuning their coefficients to engineer new positive semidefinite kernels. This may not only hinder full exploitation of the flexibility of kernel methods, but may also lead to misuse of indefinite kernels. Our main theorem presents a sufficient condition on polynomials such that applying them to known positive semidefinite kernels yields positive semidefinite kernels. The condition is very simple and therefore has a wide range of applications; moreover, for polynomials of degree 1, it is a necessary condition as well. We demonstrate the effectiveness of our theorem through three corollaries: the first generalizes the polynomial kernels; the second shows how to extend the principal-angle kernels, the trace kernels, and the determinant kernels; and the third gives corrected sufficient conditions for the codon-improved kernels and the weighted-degree kernels with shifts to be positive semidefinite.
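As a concrete illustration of the kind of guarantee the abstract describes (this sketch uses the classical sufficient condition, not the paper's theorem): by the Schur product theorem, entrywise products of positive semidefinite matrices are positive semidefinite, so applying a polynomial with nonnegative coefficients entrywise to a PSD Gram matrix yields another PSD Gram matrix. The helper `poly_kernel_matrix` below is a hypothetical name introduced for this sketch.

```python
import numpy as np

def poly_kernel_matrix(K, coeffs):
    """Apply p(x) = sum_i coeffs[i] * x**i entrywise to the Gram matrix K.

    If K is PSD and all coefficients are nonnegative, the result is PSD:
    each entrywise power K**i is PSD (Schur product theorem), and a
    nonnegative combination of PSD matrices is PSD.
    """
    P = np.zeros_like(K, dtype=float)
    for i, c in enumerate(coeffs):
        P += c * np.power(K, i)
    return P

rng = np.random.default_rng(0)
X = rng.standard_normal((6, 3))
K = X @ X.T                       # linear-kernel Gram matrix: PSD by construction

coeffs = [1.0, 0.5, 0.25]         # nonnegative coefficients preserve PSD-ness
P = poly_kernel_matrix(K, coeffs)

eigvals = np.linalg.eigvalsh(P)
print(bool(np.all(eigvals >= -1e-9)))  # True: the transformed matrix is still PSD
```

A coefficient that is negative can break positive semidefiniteness, which is exactly the kind of misuse of indefinite kernels the abstract warns about; the paper's contribution is a simple condition that delineates when such constructions remain safe.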