In this paper, we propose a general technique for solving support vector classifiers (SVCs) with an arbitrary loss function, based on an iterative reweighted least squares (IRWLS) procedure. We further show that three properties of the SVC solution can be expressed as conditions on the loss function. This technique makes it possible to apply the empirical risk minimization (ERM) inductive principle to large-margin classifiers while, at the same time, obtaining very compact solutions (in terms of the number of support vectors). The improvements obtained by changing the SVC loss function are illustrated with synthetic and real data examples.
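To make the IRWLS idea concrete, the following is a minimal sketch for the simplest case: a linear SVC with the standard hinge loss. All names here are hypothetical and the code is not the paper's implementation (which covers arbitrary losses and kernels); it only illustrates the iteration. At each step the loss L(e_i) = C·max(0, e_i) on the margin slack e_i = 1 − y_i·f(x_i) is replaced by a quadratic approximation (a_i/2)·e_i² with weight a_i = C/e_i on the active slacks, and the resulting weighted regularized least-squares problem is solved in closed form.

```python
import numpy as np

def irwls_svc(X, y, C=1.0, n_iter=50, eps=1e-9, a_max=1e8, tol=1e-6):
    """Hypothetical IRWLS sketch for a linear SVC with hinge loss.

    X: (n, d) data matrix; y: (n,) labels in {-1, +1}.
    Returns (w, b) for the decision function f(x) = w.x + b.
    """
    n, d = X.shape
    # Rows z_i = y_i * [x_i, 1], so the slack is e_i = 1 - z_i . theta
    Z = y[:, None] * np.hstack([X, np.ones((n, 1))])
    R = np.eye(d + 1)
    R[-1, -1] = 0.0                    # do not regularize the bias term
    theta = np.zeros(d + 1)
    for _ in range(n_iter):
        e = 1.0 - Z @ theta            # margin slacks
        # IRWLS weights: a_i = C / e_i on active slacks, clipped for stability
        a = np.where(e > 0, np.minimum(C / np.maximum(e, eps), a_max), 0.0)
        if not np.any(a > 0):          # all margins satisfied: nothing to fit
            break
        # Weighted regularized LS: (R + Z^T D_a Z) theta = Z^T D_a 1
        A = R + Z.T @ (a[:, None] * Z)
        theta_new = np.linalg.solve(A, Z.T @ a)
        if np.linalg.norm(theta_new - theta) < tol:
            theta = theta_new
            break
        theta = theta_new
    return theta[:-1], theta[-1]
```

With labels in {−1, +1}, predictions are sign(X @ w + b). Swapping in a different loss function, as the paper proposes, only changes how the weights a_i are computed from the slacks; the weighted least-squares step itself is unchanged.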