In this paper we embed evolutionary computation into statistical learning theory. First, we outline the connection between large margin optimization and statistical learning and explain why this paradigm is successful for many pattern recognition problems. We then embed evolutionary computation into the most prominent representative of this class of learning methods, the Support Vector Machine (SVM). In contrast to former applications of evolutionary algorithms to SVMs, we do not merely optimize the method or kernel parameters; instead, we use both evolution strategies and particle swarm optimization to solve the posed constrained optimization problem directly. Transforming the problem into the Wolfe dual reduces the total runtime and allows the use of kernel functions. Exploiting knowledge about this optimization problem leads to a hybrid mutation which further decreases convergence time while preserving classification accuracy. We show that evolutionary SVMs are at least as accurate as their quadratic programming counterparts on six real-world benchmark data sets, and that the evolutionary variants frequently outperform their quadratic programming competitors. Additionally, the proposed approach is more generic than existing traditional solutions, since it also works for non-positive semidefinite kernel functions and for several, possibly competing, performance criteria.
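To make the optimization target concrete: the Wolfe dual of the soft-margin SVM maximizes sum_i alpha_i - (1/2) sum_{i,j} alpha_i alpha_j y_i y_j k(x_i, x_j) subject to the box constraint 0 <= alpha_i <= C and the equality constraint sum_i alpha_i y_i = 0. The sketch below is only an illustration of the general idea, not the authors' implementation: a plain (mu, lambda) evolution strategy maximizes this dual, enforcing the box constraint by clipping and the equality constraint via a penalty term. The step that occasionally resets coefficients to exactly zero is an assumed, loose stand-in for the hybrid mutation mentioned above, exploiting the sparsity of typical SVM solutions. The kernel choice, penalty weight, and all strategy parameters are illustrative assumptions.

    import numpy as np

    def rbf_kernel(X, gamma=1.0):
        # Gram matrix of the RBF kernel; any kernel, including a
        # non-positive-semidefinite one, could be substituted here.
        sq = np.sum(X ** 2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
        return np.exp(-gamma * d2)

    def dual_fitness(alpha, K, y, penalty=10.0):
        # Wolfe dual objective of the soft-margin SVM; the equality
        # constraint sum_i alpha_i * y_i = 0 is added as a penalty term.
        obj = alpha.sum() - 0.5 * alpha @ ((np.outer(y, y) * K) @ alpha)
        return obj - penalty * abs(alpha @ y)

    def es_svm(K, y, C=1.0, mu=5, lam=20, sigma=0.1,
               zero_rate=0.05, generations=200, seed=0):
        # (mu, lambda) evolution strategy over the dual variables alpha.
        rng = np.random.default_rng(seed)
        n = len(y)
        pop = rng.uniform(0.0, C, size=(mu, n))
        for _ in range(generations):
            # Gaussian mutation of randomly chosen parents.
            children = (pop[rng.integers(mu, size=lam)]
                        + sigma * rng.standard_normal((lam, n)))
            # Assumed stand-in for the hybrid mutation: reset some
            # coefficients to exactly zero (most optimal alphas are zero).
            children[rng.random((lam, n)) < zero_rate] = 0.0
            # Box constraint 0 <= alpha_i <= C enforced by clipping.
            children = np.clip(children, 0.0, C)
            fits = np.array([dual_fitness(a, K, y) for a in children])
            # Comma selection: keep the mu fittest children.
            pop = children[np.argsort(fits)[-mu:]]
        return max(pop, key=lambda a: dual_fitness(a, K, y))

    # Usage on a toy linearly separable problem:
    X = np.random.default_rng(1).standard_normal((40, 2))
    y = np.where(X[:, 0] > 0, 1.0, -1.0)
    alpha = es_svm(rbf_kernel(X), y)

Note that the search loop only ever evaluates the dual objective, so nothing in it requires the Gram matrix to be positive semidefinite; this is exactly why such a direct evolutionary search also applies to indefinite kernel functions, and why the fitness function can be swapped for other, possibly competing, performance criteria.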