On the problem of finding the least number of features by L1-norm minimisation

  • Authors:
  • Sascha Klement; Thomas Martinetz

  • Affiliations:
  • Institute for Neuro- and Bioinformatics, University of Lübeck, Lübeck, Germany; Institute for Neuro- and Bioinformatics, University of Lübeck, Lübeck, Germany

  • Venue:
  • ICANN'11 Proceedings of the 21st International Conference on Artificial Neural Networks - Volume Part I
  • Year:
  • 2011

Abstract

Recently, the so-called Support Feature Machine (SFM) was proposed as a novel approach to feature selection for classification. It relies on approximating the zero-norm minimising weight vector of a separating hyperplane by minimising its one-norm instead. In contrast to the L1-SVM, it uses an additional constraint based on the average of the data points. In experiments on artificial datasets, we observe that the SFM is clearly superior, returning fewer features and a larger fraction of truly relevant features. Here, we derive a necessary condition for the zero-norm and the one-norm solution to coincide. Based on this condition, the superiority of the SFM can be made plausible.
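
To illustrate the one-norm relaxation described in the abstract, the sketch below sets up a linear programme that minimises ||w||_1 subject to non-negative separation constraints and an equality constraint on the average of the data points. This is only a minimal sketch of the idea: the exact SFM programme is the one given in the paper, and the function name sfm_weights, the precise constraint layout, and the use of scipy.optimize.linprog are assumptions made for illustration.

    import numpy as np
    from scipy.optimize import linprog

    def sfm_weights(X, y):
        """Sketch of an SFM-style linear programme (assumed formulation):
        minimise ||w||_1
        s.t.  y_i (w^T x_i + b) >= 0            for all i
              (1/n) * sum_i y_i (w^T x_i + b) = 1
        X: (n, d) data matrix, y: labels in {-1, +1}."""
        n, d = X.shape
        # Variable layout: [w (d), b (1), u (d)], with u_j bounding |w_j|.
        c = np.concatenate([np.zeros(d + 1), np.ones(d)])  # objective: sum(u) = ||w||_1

        # |w_j| <= u_j encoded as w_j - u_j <= 0 and -w_j - u_j <= 0.
        I = np.eye(d)
        A_abs = np.block([
            [ I, np.zeros((d, 1)), -I],
            [-I, np.zeros((d, 1)), -I],
        ])
        b_abs = np.zeros(2 * d)

        # Separation constraints: -y_i (w^T x_i + b) <= 0.
        A_sep = -np.hstack([y[:, None] * X, y[:, None], np.zeros((n, d))])
        b_sep = np.zeros(n)

        A_ub = np.vstack([A_abs, A_sep])
        b_ub = np.concatenate([b_abs, b_sep])

        # Average-based equality constraint: (1/n) sum_i y_i (w^T x_i + b) = 1.
        A_eq = np.hstack([(y[:, None] * X).mean(axis=0),
                          [y.mean()], np.zeros(d)])[None, :]
        b_eq = np.array([1.0])

        bounds = [(None, None)] * (d + 1) + [(0.0, None)] * d  # w, b free; u >= 0
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                      bounds=bounds, method="highs")
        if not res.success:
            raise RuntimeError("LP infeasible; data may not be linearly separable")
        return res.x[:d], res.x[d]

    # Toy usage: only the first two features determine the labels,
    # so most entries of w should come out (near) zero.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 10))
    y = np.sign(X[:, 0] + 0.5 * X[:, 1])
    w, b = sfm_weights(X, y)
    print(np.round(w, 3))

Replacing the two constraints with the usual margin constraints y_i (w^T x_i + b) >= 1 and dropping the equality would give the corresponding L1-SVM baseline that the abstract compares against.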