Reproducing kernel Kreĭn spaces are used in learning from data via kernel methods when the kernel is indefinite. In this paper, a characterization of a subset of the unit ball in such spaces is provided. Conditions are given under which upper bounds on the estimation error and the approximation error can be applied simultaneously to such a subset. Finally, it is shown that the hyperbolic-tangent kernel and other indefinite kernels satisfy these conditions.
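As a minimal numerical sketch of why the hyperbolic-tangent kernel leads to a Kreĭn rather than a Hilbert space: its Gram matrix can have negative eigenvalues, i.e. the kernel is indefinite. The parameter values (a = 1, b = 1) and sample points below are illustrative choices, not taken from the paper.

```python
import numpy as np

def tanh_kernel(X, a=1.0, b=1.0):
    """Gram matrix of the hyperbolic-tangent (sigmoid) kernel
    k(x, y) = tanh(a * <x, y> + b) on the rows of X."""
    return np.tanh(a * (X @ X.T) + b)

# Three points on the real line (illustrative sample).
X = np.array([[-2.0], [0.0], [2.0]])
K = tanh_kernel(X)

# Eigenvalues of the symmetric Gram matrix; a negative one
# shows the kernel is not positive semidefinite.
eigenvalues = np.linalg.eigvalsh(K)
min_eig = eigenvalues.min()  # strictly negative for this choice of points
```

Because the smallest eigenvalue is negative, no feature map into a Hilbert space can reproduce this kernel; the natural setting is a Kreĭn space, where the inner product decomposes into a difference of two positive parts.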