Tuning L1-SVM hyperparameters with modified radius margin bounds and simulated annealing

  • Authors:
  • Javier Acevedo;Saturnino Maldonado;Philip Siegmann;Sergio Lafuente;Pedro Gil

  • Affiliations:
  • University of Alcala, Teoría de la señal, Alcala de Henares, Spain (all authors)

  • Venue:
  • IWANN'07: Proceedings of the 9th International Work-Conference on Artificial Neural Networks
  • Year:
  • 2007

Abstract

In the design of support vector machines, an important step is the selection of the optimal hyperparameters. One of the most widely used performance estimators is the radius-margin bound. Modifications of this bound have been proposed to adapt it to soft margin problems, yielding a convex optimization problem for the L2 soft margin formulation. However, the L1 case remains of interest because it reduces the number of support vectors. Several adaptations of the radius-margin bound to the L1 case have been proposed, but some of them cannot be tested with gradient descent because the resulting bounds are not differentiable. In this work we propose simulated annealing as a method for finding the optimal hyperparameters when the bounds are not differentiable, have multiple local minima, or the kernel is not differentiable with respect to its hyperparameters.
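The key point of the abstract is that simulated annealing only requires the ability to *evaluate* the bound, not to differentiate it. A minimal sketch of the idea follows; the objective `bound_estimate` is a hypothetical, deliberately non-smooth stand-in for a radius-margin-style bound evaluated at hyperparameters (log C, log γ), not the bound used in the paper, and the annealing schedule (geometric cooling, uniform proposals) is one common choice among many.

```python
import math
import random

def simulated_annealing(objective, x0, step=0.5, t0=1.0, cooling=0.95,
                        iters=500, seed=0):
    """Minimize `objective` over a real vector using simulated annealing.

    Only function evaluations are needed, so the objective may be
    non-differentiable or have multiple local minima.
    """
    rng = random.Random(seed)
    x, fx = list(x0), objective(x0)
    best_x, best_f = list(x), fx
    t = t0
    for _ in range(iters):
        # Propose a random neighbour of the current point.
        cand = [xi + rng.uniform(-step, step) for xi in x]
        fc = objective(cand)
        # Always accept downhill moves; accept uphill moves with
        # probability exp(-delta / t), which shrinks as t cools.
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = list(x), fx
        t *= cooling  # geometric cooling schedule
    return best_x, best_f

# Hypothetical stand-in for a non-differentiable performance bound at
# hyperparameters (log C, log gamma); its minimum is at (1.0, -2.0).
def bound_estimate(params):
    log_c, log_gamma = params
    return abs(log_c - 1.0) + abs(log_gamma + 2.0)

best, value = simulated_annealing(bound_estimate, [0.0, 0.0])
```

In a real tuning loop, `bound_estimate` would be replaced by an evaluation of the chosen L1 radius-margin bound (training the SVM at the candidate hyperparameters and computing the bound); the annealer itself is unchanged.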