This paper investigates the generalization performance of support vector classifiers for density level detection (DLD) when the input space is a separable Hilbert space. A learning-rate estimate for the DLD problem is established via Rademacher averages and iterative techniques; unlike previous work, it does not rely on covering-number assumptions.
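To make the central quantity concrete, the empirical Rademacher average of a finite function class on a sample is E_sigma[sup_f (1/n) sum_i sigma_i f(x_i)], where the sigma_i are independent uniform signs. The sketch below is a Monte Carlo estimator of this quantity for a class given by its values on the sample; it is purely illustrative and is not the paper's estimator or proof technique (the function name and interface are assumptions for this example).

```python
import numpy as np

def empirical_rademacher(fvals, n_trials=2000, rng=None):
    """Monte Carlo estimate of the empirical Rademacher average.

    fvals: (k, n) array; row j holds the values of the j-th function
           of the class on the n sample points.
    n_trials: number of random sign vectors to average over.
    """
    rng = np.random.default_rng(rng)
    k, n = fvals.shape
    total = 0.0
    for _ in range(n_trials):
        # Draw i.i.d. Rademacher signs sigma_i in {-1, +1}.
        sigma = rng.choice([-1.0, 1.0], size=n)
        # Supremum over the (finite) class of the sign-weighted mean.
        total += np.max(fvals @ sigma) / n
    return total / n_trials

# The class containing only the zero function has Rademacher average 0.
print(empirical_rademacher(np.zeros((1, 10))))  # → 0.0
```

For functions bounded in [-1, 1] the estimate lies in [0, 1], and it shrinks as the sample size n grows for a fixed class, which is the mechanism behind Rademacher-based risk bounds such as the one established in the paper.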