The nature of statistical learning theory
Making large-scale support vector machine learning practical
Advances in kernel methods
Multi-Objective Optimization Using Evolutionary Algorithms
Choosing Multiple Parameters for Support Vector Machines
Machine Learning
Estimating the Generalization Performance of an SVM Efficiently
ICML '00 Proceedings of the Seventeenth International Conference on Machine Learning
A Short Tutorial on Evolutionary Multiobjective Optimization
EMO '01 Proceedings of the First International Conference on Evolutionary Multi-Criterion Optimization
Bounds on Error Expectation for Support Vector Machines
Neural Computation
Determining optimal decision model for support vector machine by genetic algorithm
CIS'04 Proceedings of the First international conference on Computational and Information Science
SVM model selection with the VC bound
CIS'04 Proceedings of the First international conference on Computational and Information Science
A fast and elitist multiobjective genetic algorithm: NSGA-II
IEEE Transactions on Evolutionary Computation
ISICA '08 Proceedings of the 3rd International Symposium on Advances in Computation and Intelligence
Multi-objective feature selection in music genre and style recognition tasks
Proceedings of the 13th annual conference on Genetic and evolutionary computation
Selecting proper parameters is an important issue in extending the classification ability of the Support Vector Machine (SVM), and it is what makes SVM practically useful. The Genetic Algorithm (GA) has been widely applied to parameter selection for SVM classification because of its ability to quickly discover good solutions to complex search and optimization problems. However, traditional GA approaches in this field rely on a single generalization error bound as the fitness function for selecting parameters. Since several generalization error bounds have been developed, picking a single criterion as the fitness function seems arbitrary and insufficient. Motivated by multi-objective optimization, this paper introduces an efficient method of parameter selection for SVM classification based on the multi-objective evolutionary algorithm NSGA-II. We also introduce an adaptive mutation rate for NSGA-II. Experimental results show that our method outperforms single-objective approaches, especially on tiny training sets with large testing sets.
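The core of NSGA-II, as used here for parameter selection, is ranking candidate solutions into Pareto fronts instead of collapsing several objectives into one fitness value. The following is a minimal sketch of that non-dominated sorting step, not the paper's implementation; each tuple of objective values stands in for one (C, gamma) candidate scored on several generalization error bounds, and the example scores are purely illustrative.

```python
# Minimal sketch of the fast non-dominated sorting at the heart of NSGA-II.
# All objectives are minimized (e.g., several SVM generalization error bounds).

def dominates(a, b):
    """a dominates b if a is no worse on every objective and strictly
    better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(objectives):
    """Partition candidate indices into Pareto fronts (front 0 is best)."""
    n = len(objectives)
    dominated_by = [set() for _ in range(n)]   # solutions that i dominates
    domination_count = [0] * n                 # how many solutions dominate i
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if dominates(objectives[i], objectives[j]):
                dominated_by[i].add(j)
            elif dominates(objectives[j], objectives[i]):
                domination_count[i] += 1
        if domination_count[i] == 0:
            fronts[0].append(i)
    k = 0
    while fronts[k]:
        next_front = []
        for i in fronts[k]:
            for j in dominated_by[i]:
                domination_count[j] -= 1
                if domination_count[j] == 0:
                    next_front.append(j)
        k += 1
        fronts.append(next_front)
    return fronts[:-1]

# Hypothetical candidates scored on two error bounds to minimize.
scores = [(0.2, 0.9), (0.4, 0.4), (0.9, 0.1), (0.5, 0.5), (0.3, 0.8)]
print(non_dominated_sort(scores))  # candidate 3 is dominated by candidate 1
```

In a full NSGA-II loop, the fronts are further ordered by crowding distance before selection, and mutation (here, with the adaptive rate the paper proposes) perturbs the surviving parameter settings.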