Learning general Gaussian kernel hyperparameters of SVMs using optimization on symmetric positive-definite matrices manifold

  • Authors:
  • Hicham Laanaya
  • Fahed Abdallah
  • Hichem Snoussi
  • Cédric Richard

  • Affiliations:
  • Centre de Recherche de Royallieu, Lab. Heudiasyc, UMR CNRS 6599, BP 20529, 60205 Compiègne, France and Faculté des Sciences Rabat, Université Mohammed V-Agdal, 4 Avenue Ibn Battouta ...
  • Centre de Recherche de Royallieu, Lab. Heudiasyc, UMR CNRS 6599, BP 20529, 60205 Compiègne, France
  • Institut Charles Delaunay (FRE CNRS 2848), Université de Technologie de Troyes, 10010 Troyes, France
  • Institut Charles Delaunay (FRE CNRS 2848), Université de Technologie de Troyes, 10010 Troyes, France

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2011

Quantified Score

Hi-index 0.10

Abstract

We propose a new method for optimizing the hyperparameters of a general Gaussian kernel for support vector machine classification. The hyperparameters are constrained to lie on a differentiable manifold, and the proposed optimization technique is a gradient-like descent algorithm adapted to the geometric structure of the manifold of symmetric positive-definite matrices. We compare the performance of our approach with the classical support vector machine and with other state-of-the-art methods on toy data and on real-world data sets.
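The core idea can be illustrated with a short sketch: a "general" Gaussian kernel replaces the scalar bandwidth with a symmetric positive-definite matrix A, i.e. k(x, y) = exp(-(x-y)^T A (x-y)), and each descent step moves A along the manifold of SPD matrices so it remains positive definite. The snippet below is a minimal illustration, not the authors' exact algorithm; it assumes the affine-invariant metric on SPD matrices, whose exponential map gives the update A ← A^{1/2} exp(-η A^{-1/2} G A^{-1/2}) A^{1/2}, and the gradient G here is a made-up placeholder.

```python
import numpy as np
from scipy.linalg import expm, sqrtm

def gaussian_kernel(x, y, A):
    """General Gaussian kernel k(x, y) = exp(-(x-y)^T A (x-y)) with A SPD."""
    d = x - y
    return np.exp(-d @ A @ d)

def spd_descent_step(A, G, step):
    """One gradient-like step that stays on the SPD manifold.

    Uses the exponential map of the affine-invariant metric, so the
    result is symmetric positive definite for any symmetric gradient G.
    """
    A_half = sqrtm(A).real
    A_half_inv = np.linalg.inv(A_half)
    return A_half @ expm(-step * A_half_inv @ G @ A_half_inv) @ A_half

# Toy usage: one step from the identity with a placeholder gradient.
A = np.eye(2)
G = np.array([[0.5, 0.1],
              [0.1, 0.3]])  # hypothetical symmetric gradient of the objective
A_new = spd_descent_step(A, G, step=0.1)
assert np.all(np.linalg.eigvalsh(A_new) > 0)  # still positive definite
```

In practice G would be the gradient of a kernel-target alignment or cross-validation criterion with respect to A; the point of the manifold update is that no eigenvalue projection or clipping is needed to keep A positive definite.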