Feature Selection for Unsupervised and Supervised Inference: the Emergence of Sparsity in a Weighted-based Approach

  • Authors:
  • Lior Wolf; Amnon Shashua

  • Venue:
  • ICCV '03 Proceedings of the Ninth IEEE International Conference on Computer Vision - Volume 2
  • Year:
  • 2003

Abstract

The problem of selecting a subset of relevant features in a potentially overwhelming quantity of data is classic and found in many branches of science; examples in computer vision, text processing and, more recently, bioinformatics are abundant. In this work we present a definition of "relevancy" based on spectral properties of the Affinity (or Laplacian) of the features' measurement matrix. The feature selection process is then based on a continuous ranking of the features defined by a least-squares optimization process. A remarkable property of the feature relevance function is that sparse solutions for the ranking values naturally emerge as a result of a "biased non-negativity" of a key matrix in the process. As a result, a simple least-squares optimization process converges onto a sparse solution, i.e., a selection of a subset of features which forms a local maximum of the relevance function. The feature selection algorithm can be embedded in both unsupervised and supervised inference problems, and empirical evidence shows that the selected features typically achieve high accuracy even when only a small fraction of the features are relevant.
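The abstract's iteration (rank features continuously, weight the affinity matrix by those ranks, and re-solve a least-squares spectral problem until a sparse weight vector emerges) can be sketched as follows. This is a minimal illustrative sketch, not the paper's reference implementation: the function name `q_alpha`, the number of leading eigenvectors `k`, and the iteration count are assumptions made here for illustration.

```python
import numpy as np

def q_alpha(M, k, n_iter=50):
    """Hypothetical sketch of a spectral feature-ranking loop.

    M : (d, n) array whose d rows are the feature measurement vectors.
    k : assumed number of leading eigenvectors (e.g. expected clusters).
    Returns a unit-norm weight vector alpha over the d features; the
    claim in the paper is that such weights tend to become sparse.
    """
    d, n = M.shape
    alpha = np.full(d, 1.0 / np.sqrt(d))       # start from uniform weights
    for _ in range(n_iter):
        # Feature-weighted affinity: A = sum_i alpha_i * m_i m_i^T  (n x n)
        A = (M * alpha[:, None]).T @ M
        # Spectral step: take the k leading eigenvectors of the affinity.
        _, V = np.linalg.eigh(A)               # eigenvalues ascending
        Q = V[:, -k:]                          # (n, k) top-k eigenvectors
        # Least-squares step: re-rank features against the subspace Q.
        # G_ij = (m_i^T m_j) * (m_i^T Q Q^T m_j), symmetric by construction.
        MQ = M @ Q                             # (d, k)
        G = (M @ M.T) * (MQ @ MQ.T)
        # New ranking = leading eigenvector of G, normalized to unit norm.
        _, W = np.linalg.eigh(G)
        alpha = W[:, -1]
        if alpha.sum() < 0:                    # resolve eigenvector sign
            alpha = -alpha
    return alpha
```

The loop alternates a spectral decomposition of the weighted affinity with a least-squares re-ranking of the features, matching the structure the abstract describes; whether the weights actually sparsify depends on the spectral properties of the data.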