In this paper, we propose a novel robust support vector machine based on the smooth Ramp loss, which strongly suppresses the influence of outliers. The concave-convex procedure (CCCP) is used to decompose the associated non-convex optimization into a sequence of convex problems. A Newton-type algorithm is then developed to solve the resulting primal optimization of the robust support vector machine, and its convergence and computational complexity are analyzed. Experimental results on both synthetic and real data sets show that the proposed approach is significantly more robust to outliers and yields better generalization performance than classical support vector machines.
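To make the CCCP idea concrete, the sketch below applies it to a linear ramp-loss SVM trained in the primal. This is an illustrative reconstruction, not the paper's algorithm: it uses the non-smooth ramp loss R_s(z) = H_1(z) - H_s(z) (difference of two hinges) rather than the paper's smooth variant, and solves each convex CCCP subproblem by plain subgradient descent instead of a Newton-type method. All function and parameter names are hypothetical.

```python
import numpy as np

def ccp_ramp_svm(X, y, C=1.0, s=-1.0, outer=10, inner=200, lr=0.01):
    """Illustrative CCCP for a linear ramp-loss SVM (primal).

    Ramp loss: R_s(z) = H_1(z) - H_s(z), with hinge H_a(z) = max(0, a - z).
    Each CCCP step linearizes the concave part -C*H_s at the current iterate,
    leaving a convex weighted-hinge subproblem.
    """
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(outer):
        # Linearize the concave term: delta_i = C if y_i * f(x_i) < s, else 0.
        margins = y * (X @ w + b)
        delta = C * (margins < s)
        # Convex subproblem:
        #   min_w,b  0.5||w||^2 + C * sum_i H_1(y_i f_i) + sum_i delta_i y_i f_i
        # solved here by fixed-step subgradient descent (Newton in the paper).
        for _ in range(inner):
            m = y * (X @ w + b)
            active = (m < 1).astype(float)   # subgradient indicator of H_1
            g = C * active - delta           # per-example coefficient
            grad_w = w - (g * y) @ X
            grad_b = -np.sum(g * y)
            w -= lr * grad_w
            b -= lr * grad_b
    return w, b
```

Because delta_i caps each example's contribution once its margin falls below s, badly misclassified outliers stop pulling on the hyperplane after the first few CCCP iterations, which is the source of the robustness discussed in the abstract.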