A hybrid PSO-FSVM model and its application to imbalanced classification of mammograms
Proceedings of the 5th Asian Conference on Intelligent Information and Database Systems (ACIIDS 2013), Part I
The parameter settings of a support vector machine (SVM) have a great influence on its performance. Grid search combined with cross-validation, and numerical methods that minimize some generalization error bound, are the two commonly adopted approaches for tuning the multiple parameters of an SVM. However, grid search is often time-consuming, especially when several parameters must be tuned, while the numerical methods are very sensitive to the initial parameter values. In this paper, we present a hybrid strategy that combines a comprehensive learning particle swarm optimizer (CLPSO) with the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method to tune the SVM parameters effectively based on generalization bounds. Rather than locating a single local optimum, the hybrid method can identify multiple local optima of the generalization bounds, which greatly improves the stability of the parameter settings. Experimental results show that the proposed method efficiently tunes the parameters of both L1-SVM and L2-SVM and achieves competitive performance compared with other optimized classifiers.
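The global-then-local scheme the abstract describes can be illustrated with a minimal sketch. Everything below is hypothetical: `bound` is a toy multimodal surrogate for an SVM generalization bound over two parameters (not the paper's radius-margin bound), `clpso_like_search` is a plain global-best PSO rather than true CLPSO, and `local_refine` uses finite-difference gradient descent with backtracking as a stand-in for BFGS. The sketch also refines only the single best particle, whereas the paper's method identifies multiple local optima.

```python
import math
import random

# Toy 2-D multimodal surrogate for an SVM generalization bound over
# (log C, log gamma); the real bound (e.g. radius-margin) is assumed away.
def bound(params):
    c, g = params
    return math.sin(3.0 * c) * math.cos(3.0 * g) + 0.1 * ((c - 1.0) ** 2 + (g - 1.0) ** 2)

def clpso_like_search(f, n_particles=20, iters=60, lo=-3.0, hi=3.0, seed=0):
    """Plain global-best PSO (a simplification of CLPSO) over a box."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi), rng.uniform(lo, hi)] for _ in range(n_particles)]
    vel = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(2):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            v = f(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

def local_refine(f, x0, steps=200, eps=1e-5):
    """Finite-difference gradient descent with backtracking, standing in
    for the BFGS refinement step; only improving moves are accepted."""
    x, fx = x0[:], f(x0)
    lr = 0.1
    for _ in range(steps):
        grad = []
        for d in range(2):
            xp, xm = x[:], x[:]
            xp[d] += eps
            xm[d] -= eps
            grad.append((f(xp) - f(xm)) / (2.0 * eps))
        cand = [x[d] - lr * grad[d] for d in range(2)]
        fc = f(cand)
        if fc < fx:
            x, fx = cand, fc
        else:
            lr *= 0.5  # crude backtracking in place of a BFGS line search
            if lr < 1e-8:
                break
    return x, fx

# Global search over the bound, then local refinement of the best particle.
start, v_global = clpso_like_search(bound)
best, v_local = local_refine(bound, start)
assert v_local <= v_global  # refinement never worsens the best value found
```

The division of labor mirrors the abstract's motivation: the swarm stage makes the result insensitive to initialization (the weakness of pure numerical methods), while the gradient-based stage converges precisely without the cost of an exhaustive grid.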