In this letter, we consider a density-level detection (DLD) problem by a coefficient-based classification framework with -regularizer and data-dependent hypothesis spaces. Although the data-dependent nature of the algorithm provides flexibility and adaptivity for DLD, it complicates the generalization error analysis. To overcome this difficulty, an error decomposition is introduced from an established classification framework. On the basis of this decomposition, an estimate of the learning rate is obtained by using Rademacher averages and a stepping-stone technique. In particular, the estimate is independent of the capacity assumptions used in the previous literature.
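As a minimal sketch of the setup in assumed notation (the kernel K, loss phi, threshold rho, and coefficient regularizer Omega are illustrative; the paper's specific regularizer symbol did not survive this copy): DLD asks for the level set of an unknown density p,

\[
\{\, x : p(x) > \rho \,\},
\]

and a coefficient-based classifier over the data-dependent hypothesis space \(\mathcal{H}_{\mathbf{z}} = \{ f_{\boldsymbol{\alpha}} = \sum_{i=1}^{m} \alpha_i K(\cdot, x_i) : \boldsymbol{\alpha} \in \mathbb{R}^m \}\) is typically obtained by regularized empirical risk minimization,

\[
f_{\mathbf{z}} = \sum_{i=1}^{m} \alpha_i^{\mathbf{z}} K(\cdot, x_i),
\qquad
\boldsymbol{\alpha}^{\mathbf{z}} = \operatorname*{arg\,min}_{\boldsymbol{\alpha} \in \mathbb{R}^m}
\frac{1}{m} \sum_{i=1}^{m} \phi\bigl( y_i f_{\boldsymbol{\alpha}}(x_i) \bigr)
+ \lambda\, \Omega(\boldsymbol{\alpha}),
\]

where the sample \(\mathbf{z} = \{(x_i, y_i)\}_{i=1}^{m}\) carries synthetic labels separating the data distribution from a reference measure, and \(\lambda > 0\) trades off the empirical fit against the size of the coefficient vector. Because the hypothesis space \(\mathcal{H}_{\mathbf{z}}\) itself depends on the sample, standard capacity-based error bounds do not apply directly, which is the difficulty the letter's error decomposition and Rademacher-average argument address.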