Recently, evolutionary computation has been successfully integrated into statistical learning methods. A Support Vector Machine (SVM) that uses evolution strategies for its optimization problem frequently delivers better results with respect to both the optimization criterion and the prediction accuracy. Moreover, evolutionary computation allows for the efficient large margin optimization of a huge family of new kernel functions, namely non-positive semidefinite kernels such as the Epanechnikov kernel. For these kernel functions, evolutionary SVMs even outperform other learning methods like the Relevance Vector Machine. In this paper, we discuss another major advantage of evolutionary SVMs over traditional SVM solutions: by embedding multi-objective optimization into the evolutionary SVM, we can explicitly optimize the inherent trade-off between training error and model complexity. This leads to three advantages. First, it is no longer necessary to tune the SVM parameter C, which weighs the two conflicting criteria and whose tuning is a very time-consuming task for traditional SVMs. Second, the shape and size of the resulting Pareto front give interesting insights into the complexity of the learning task at hand. Finally, the user can actually see the point where overfitting occurs and can easily select the solution from the Pareto front that best suits his or her needs.
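The trade-off described above can be illustrated with a minimal sketch: a (mu + lambda)-style evolution strategy that evolves the weights of a linear SVM under two objectives, training hinge loss and model complexity (the squared weight norm), and keeps the non-dominated set as an approximate Pareto front. This is a toy illustration under simplifying assumptions (pure-Python data, Gaussian mutation, naive non-dominated sorting), not the paper's actual multi-objective SVM implementation.

```python
import random

random.seed(0)

# Hypothetical toy data: two 2-D Gaussian blobs with labels -1 / +1.
def make_data(n=40):
    data = []
    for _ in range(n):
        if random.random() < 0.5:
            data.append(([random.gauss(-1.5, 1.0), random.gauss(-1.5, 1.0)], -1))
        else:
            data.append(([random.gauss(1.5, 1.0), random.gauss(1.5, 1.0)], 1))
    return data

DATA = make_data()

def objectives(w):
    """Return (training error, model complexity) for the linear model
    f(x) = w[0]*x0 + w[1]*x1 + w[2], i.e. the two conflicting criteria
    that C normally weighs against each other."""
    hinge = sum(max(0.0, 1.0 - y * (w[0] * x[0] + w[1] * x[1] + w[2]))
                for x, y in DATA)
    norm2 = w[0] ** 2 + w[1] ** 2  # margin-based complexity ||w||^2
    return hinge, norm2

def dominates(a, b):
    """a Pareto-dominates b: no worse in every objective, better in one."""
    return all(ai <= bi for ai, bi in zip(a, b)) and a != b

def pareto_front(pop):
    scored = [(ind, objectives(ind)) for ind in pop]
    return [(ind, f) for ind, f in scored
            if not any(dominates(g, f) for _, g in scored)]

# Evolution loop: Gaussian mutation, survival of the non-dominated set.
pop = [[random.gauss(0, 1) for _ in range(3)] for _ in range(20)]
for gen in range(50):
    offspring = [[wi + random.gauss(0, 0.2) for wi in ind]
                 for ind in pop for _ in range(2)]
    pop = [ind for ind, _ in pareto_front(pop + offspring)][:20]
    while len(pop) < 20:  # refill with random individuals if the front is small
        pop.append([random.gauss(0, 1) for _ in range(3)])

# The resulting front exposes the error/complexity trade-off explicitly;
# the user picks one point instead of tuning C.
for ind, (err, comp) in sorted(pareto_front(pop), key=lambda t: t[1][0])[:5]:
    print(f"hinge={err:7.2f}  ||w||^2={comp:7.2f}")
```

In this sketch the role of C disappears entirely: instead of collapsing both criteria into one weighted objective, the front retains every non-dominated (error, complexity) pair, which is exactly what makes the overfitting region visible to the user.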