Analysis of SVM regression bounds for variable ranking

  • Authors: A. Rakotomamonjy

  • Affiliation: LITIS, Perception Systèmes et Informations, INSA de Rouen, Avenue de l'Université, 76801 Saint-Étienne-du-Rouvray, France

  • Venue: Neurocomputing
  • Year: 2007

Abstract

This paper addresses the problem of variable ranking for support vector regression. The ranking criteria we propose are based on leave-one-out bounds and some of their variants, and for these criteria we compare different search-space algorithms: recursive feature elimination and scaling-factor optimization based on gradient descent. All of these algorithms are compared on toy problems and real-world QSAR data sets. Results show that the radius-margin criterion is the most efficient criterion for ranking variables. Using this criterion can then lead to a support vector regressor with lower error while using fewer variables. Our results also support the evidence that the gradient-descent algorithm achieves a better variable ranking than the backward-elimination algorithm.
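To illustrate the backward (recursive feature elimination) search strategy the abstract compares against, here is a minimal sketch. It is not the paper's method: the paper ranks variables with SVM leave-one-out and radius-margin bounds, whereas this sketch plugs in a plain least-squares training error as a stand-in criterion so the example stays self-contained. The function names (`backward_ranking`, `lstsq_error`) and the synthetic data are illustrative assumptions, not from the paper.

```python
import numpy as np

def backward_ranking(X, y, criterion):
    """Backward elimination: repeatedly drop the variable whose removal
    yields the lowest criterion value (i.e., hurts the model least).
    The drop order, reversed, is the ranking: last dropped = most important."""
    remaining = list(range(X.shape[1]))
    dropped = []
    while len(remaining) > 1:
        scores = []
        for j in remaining:
            subset = [k for k in remaining if k != j]
            scores.append((criterion(X[:, subset], y), j))
        _, worst = min(scores)   # removing `worst` degrades the criterion least
        remaining.remove(worst)
        dropped.append(worst)
    dropped.append(remaining[0])
    return dropped[::-1]         # most important variable first

def lstsq_error(X_sub, y):
    # Stand-in criterion (assumption): mean squared residual of a
    # least-squares fit; the paper would use an SVR leave-one-out or
    # radius-margin bound here instead.
    coef, *_ = np.linalg.lstsq(X_sub, y, rcond=None)
    return float(np.mean((X_sub @ coef - y) ** 2))

# Toy data: only variables 0 and 2 influence the target.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = 3.0 * X[:, 0] + 1.0 * X[:, 2] + 0.1 * rng.normal(size=200)

ranking = backward_ranking(X, y, lstsq_error)
print(ranking)  # the two relevant variables should be ranked first
```

The gradient-descent alternative studied in the paper instead attaches a continuous scaling factor to each variable and optimizes the bound with respect to those factors, ranking variables by the magnitude of their learned scales rather than by discrete elimination.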