Learning with indefinite kernels has attracted considerable attention in recent years due to its success in various learning scenarios. In this paper we study the asymptotic properties of regularization kernel networks whose kernels are indefinite, without the usual restrictions of symmetry and positive semi-definiteness imposed in the traditional study of kernel methods. The kernels are characterized in terms of the singular value decomposition of the corresponding kernel integral operator. Two reproducing kernel Hilbert spaces are induced to characterize the approximation ability. Capacity-independent error bounds are proved, and fast convergence rates are obtained both in the reproducing kernel Hilbert space norm and in the L^2 sense.
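As a small illustrative sketch (not code from the paper), the contrast between the Mercer eigen-expansion and the SVD characterization can be seen on a finite sample: the hypothetical kernel k(x, y) = xy - 0.5 produces a symmetric but indefinite Gram matrix (its eigenvalues have mixed signs), yet its singular value decomposition always exists with nonnegative singular values, mimicking at the matrix level the SVD of the kernel integral operator used in the analysis.

```python
import numpy as np

# Illustrative, assumed example: k(x, y) = x*y - 0.5 is a simple indefinite
# kernel. Its Gram matrix K = x x^T - 0.5 * 1 1^T has exactly one positive
# and one negative nonzero eigenvalue (by Cauchy-Schwarz, their product is
# negative), so K is not positive semi-definite.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 50)
K = np.outer(x, x) - 0.5  # symmetric, indefinite Gram matrix

eigvals = np.linalg.eigvalsh(K)
print("min/max eigenvalue:", eigvals.min(), eigvals.max())  # mixed signs

# The SVD exists for any matrix and yields nonnegative singular values;
# for a symmetric K they coincide with the absolute eigenvalues, which is
# why the SVD-based characterization covers indefinite kernels.
U, s, Vt = np.linalg.svd(K)
assert np.all(s >= 0)
assert np.allclose(np.sort(s), np.sort(np.abs(eigvals)))
```

The analogy is only heuristic: the paper works with the integral operator on L^2, of which the sample Gram matrix is a finite-rank discretization.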