The nature of statistical learning theory
Support vector machines, reproducing kernel Hilbert spaces, and randomized GACV
Advances in kernel methods
Making large-scale support vector machine learning practical
Advances in kernel methods
Dynamically adapting kernels in support vector machines
Proceedings of the 1998 conference on Advances in neural information processing systems II
On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
SIAM Journal on Optimization
Newton's Method for Large Bound-Constrained Optimization Problems
SIAM Journal on Optimization
Choosing Multiple Parameters for Support Vector Machines
Machine Learning
Estimating the Generalization Performance of an SVM Efficiently
ICML '00 Proceedings of the Seventeenth International Conference on Machine Learning
Bounds on Error Expectation for Support Vector Machines
Neural Computation
Efficient tuning of SVM hyperparameters using radius/margin bound and iterative algorithms
IEEE Transactions on Neural Networks
Leave-One-Out Bounds for Support Vector Regression Model Selection
Neural Computation
Gradient-Based Adaptation of General Gaussian Kernels
Neural Computation
Optimizing resources in model selection for support vector machine
Pattern Recognition
IEEE/ACM Transactions on Computational Biology and Bioinformatics (TCBB)
Journal of Biomedical Informatics
Classes of Kernels for Hit Definition in Compound Screening
ICAISC '08 Proceedings of the 9th international conference on Artificial Intelligence and Soft Computing
Short Communication: A geometric method for model selection in support vector machine
Expert Systems with Applications: An International Journal
Kernel Trees for Support Vector Machines
IEICE - Transactions on Information and Systems
Learning by local kernel polarization
Neurocomputing
Model selection for the LS-SVM. Application to handwriting recognition
Pattern Recognition
Predicting O-glycosylation sites in mammalian proteins by using SVMs
Computational Biology and Chemistry
Fast and efficient strategies for model selection of Gaussian support vector machine
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
An optimization on pictogram identification for the road-sign recognition task using SVMs
Computer Vision and Image Understanding
Evolutionary tuning of multiple SVM parameters
Neurocomputing
Simultaneous tuning of hyperparameter and parameter for support vector machines
PAKDD'07 Proceedings of the 11th Pacific-Asia conference on Advances in knowledge discovery and data mining
Experiments on kernel tree support vector machines for text categorization
PAKDD'07 Proceedings of the 11th Pacific-Asia conference on Advances in knowledge discovery and data mining
Tuning L1-SVM hyperparameters with modified radius margin bounds and simulated annealing
IWANN'07 Proceedings of the 9th international work conference on Artificial neural networks
Analysis of the distance between two classes for tuning SVM hyperparameters
IEEE Transactions on Neural Networks
Study on multi-label text classification based on SVM
FSKD'09 Proceedings of the 6th international conference on Fuzzy systems and knowledge discovery - Volume 1
Tuning SVM parameters by using a hybrid CLPSO-BFGS algorithm
Neurocomputing
Evolution strategies based adaptive Lp LS-SVM
Information Sciences: an International Journal
L2-SVM: Dependence on the regularization parameter
Pattern Recognition and Image Analysis
SVM based MLP neural network algorithm and application in intrusion detection
AICI'11 Proceedings of the Third international conference on Artificial intelligence and computational intelligence - Volume Part III
SVM model selection with the VC bound
CIS'04 Proceedings of the First international conference on Computational and Information Science
PSO-Based hyper-parameters selection for LS-SVM classifiers
ICONIP'06 Proceedings of the 13th international conference on Neural Information Processing - Volume Part II
Designing nonlinear classifiers through minimizing VC dimension bound
ISNN'05 Proceedings of the Second international conference on Advances in Neural Networks - Volume Part I
Automatic face recognition by support vector machines
IWCIA'04 Proceedings of the 10th international conference on Combinatorial Image Analysis
Multi-objective model selection for support vector machines
EMO'05 Proceedings of the Third international conference on Evolutionary Multi-Criterion Optimization
A comparison of model selection methods for multi-class support vector machines
ICCSA'05 Proceedings of the 2005 international conference on Computational Science and Its Applications - Volume Part IV
Properties of the solution of L2-Support Vector Machine as a function of regularization parameter
Pattern Recognition and Image Analysis
On linear programs with linear complementarity constraints
Journal of Global Optimization
Computers and Electronics in Agriculture
An important approach to efficient support vector machine (SVM) model selection is to use differentiable bounds on the leave-one-out (loo) error. Past efforts focused on finding tight loo bounds (e.g., radius margin bounds, span bounds), but their practical viability remains unsatisfactory. Duan, Keerthi, and Poo (2003) showed that the radius margin bound gives good predictions for L2-SVM, one of the cases we consider. In this letter, by analyzing why this bound performs well for L2-SVM, we show that finding a bound whose minima lie in a region with small loo values may be more important than its tightness. Based on this principle, we propose modified radius margin bounds for L1-SVM (the other case), where the original bound applies only to the hard-margin situation. Our modification for L1-SVM achieves performance comparable to that of L2-SVM. To study whether L1- or L2-SVM should be used, we analyze other properties, such as their differentiability and their numbers of support vectors and free support vectors. In this respect, L1-SVM has the advantage of fewer support vectors. Their implementations also differ, so we discuss related issues in detail.
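As a rough illustration of the radius/margin criterion the abstract discusses, the sketch below trains a (near) hard-margin linear SVM on hypothetical separable toy data and evaluates Vapnik's bound R²‖w‖² on the number of loo errors. This is only a simplified sketch: the data, the large-C approximation of the hard margin, and the centroid-based radius estimate (used in place of solving the minimum-enclosing-ball problem) are all assumptions, not the procedure of the paper.

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical separable 2-D toy data: two well-separated Gaussian clusters
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 0.5, (20, 2)), rng.normal(2.0, 0.5, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)

# A very large C makes the soft-margin linear SVM approximate the
# hard-margin classifier on separable data.
clf = SVC(kernel="linear", C=1e6).fit(X, y)
w = clf.coef_.ravel()

# Crude radius estimate: the maximum distance from the data centroid.
# The exact bound uses the minimum enclosing ball of the data, which
# requires solving a small QP; this simplification only illustrates scale.
R = np.max(np.linalg.norm(X - X.mean(axis=0), axis=1))

# Vapnik-style radius/margin bound: loo errors <= R^2 * ||w||^2.
# Model selection methods of this family minimize such a differentiable
# quantity over hyperparameters instead of running loo directly.
bound = R**2 * np.dot(w, w)
print(f"R^2 * ||w||^2 = {bound:.2f} (upper bound on the number of loo errors)")
```

Because the bound is a smooth function of hyperparameters such as C and a kernel width, it can be minimized with gradient-based methods, which is far cheaper than evaluating the loo error itself at every candidate setting.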