Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond
This paper proposes using support vector machines (SVMs) to learn the efficient set of a multiple objective discrete optimization (MODO) problem. We conjecture that the surface generated by an SVM can provide a good approximation of the efficient set. The efficient set is learned in a single SVM training run, using a group of seed solutions labeled as either efficient or dominated. To assess whether learning the efficient set via SVMs has practical implications, we incorporate the SVM-induced efficient set into a genetic algorithm (GA) as a fitness function. We test our SVM-guided GA on multiple objective knapsack and assignment problems and observe that the SVM-based fitness function improves the GA's performance relative to a benchmark distance-based fitness function and can produce competitive results. Our approach is general and can be applied to any MODO problem with any number of objective functions.
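The core idea above can be illustrated with a minimal sketch: label sample solutions of a toy biobjective maximization problem as efficient (nondominated) or dominated, fit a linear SVM to those labels via Pegasos-style subgradient descent on the hinge loss, and use the signed distance to the learned surface as a GA fitness score. Everything here (the random seed points, the linear kernel, the `svm_fitness` helper) is an illustrative assumption, not the paper's actual implementation, which may use other kernels and problem encodings.

```python
import numpy as np

rng = np.random.default_rng(0)

def pareto_labels(points):
    """Label each objective vector +1 if nondominated (efficient) under
    maximization, -1 if some other point dominates it."""
    n = len(points)
    labels = np.ones(n, dtype=int)
    for i in range(n):
        for j in range(n):
            if i != j and np.all(points[j] >= points[i]) and np.any(points[j] > points[i]):
                labels[i] = -1
                break
    return labels

def train_linear_svm(X, y, lam=0.01, epochs=200):
    """Pegasos-style stochastic subgradient descent on the hinge loss
    (a linear-SVM sketch; the paper may use a different solver/kernel)."""
    w = np.zeros(X.shape[1])
    b = 0.0
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            t += 1
            eta = 1.0 / (lam * t)          # decreasing step size
            w *= (1.0 - eta * lam)         # regularization shrinkage
            if y[i] * (X[i] @ w + b) < 1:  # margin violation -> hinge subgradient
                w += eta * y[i] * X[i]
                b += eta * y[i]
    return w, b

# Seed solutions: random objective vectors for a toy biobjective problem.
pts = rng.uniform(0.0, 1.0, size=(60, 2))
y = pareto_labels(pts)
w, b = train_linear_svm(pts, y)

def svm_fitness(obj):
    """GA fitness: signed distance to the SVM surface, so solutions closer
    to (or beyond) the learned efficient set score higher."""
    return (obj @ w + b) / np.linalg.norm(w)
```

A GA would then rank candidate solutions by `svm_fitness` of their objective vectors instead of, e.g., a distance to the known nondominated front, which is the benchmark the abstract compares against.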