Nonlinear Feature Selection by Relevance Feature Vector Machine

  • Authors:
  • Haibin Cheng; Haifeng Chen; Guofei Jiang; Kenji Yoshihira

  • Affiliations:
  • CSE Department, Michigan State University, East Lansing, MI 48824; NEC Laboratories America, Inc., 4 Independence Way, Princeton, NJ 08540

  • Venue:
  • MLDM '07 Proceedings of the 5th international conference on Machine Learning and Data Mining in Pattern Recognition
  • Year:
  • 2007

Abstract

Support vector machine (SVM) has received much attention in feature selection recently because of its ability to incorporate kernels to discover nonlinear dependencies between features. However, it is known that the number of support vectors required by SVM typically grows linearly with the size of the training data set. This limitation of SVM becomes more critical when we need to select a small subset of relevant features from a very large number of candidates. To solve this issue, this paper proposes a novel algorithm, called the `relevance feature vector machine' (RFVM), for nonlinear feature selection. The RFVM algorithm utilizes a highly sparse learning algorithm, the relevance vector machine (RVM), and incorporates kernels to extract important features with both linear and nonlinear relationships. As a result, our proposed approach reduces many false alarms, i.e., the inclusion of irrelevant features, while still maintaining good selection performance. In our experiments we compare the performance of RFVM with other state-of-the-art nonlinear feature selection algorithms. The results confirm our conclusions.
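The abstract does not give implementation details, so the following is only a rough sketch of the general idea it describes: using an RVM-style sparse Bayesian learner to score candidate features, including nonlinearly relevant ones. The sketch stands in scikit-learn's `ARDRegression` (a sparse Bayesian learner closely related to the RVM) for the RVM, and uses a hypothetical sine basis expansion in place of the paper's kernels; the synthetic data, basis choice, and variable names are all assumptions, not the authors' method.

```python
import numpy as np
from sklearn.linear_model import ARDRegression

rng = np.random.default_rng(0)
n, d = 200, 10  # 200 samples, 10 candidate features

# Synthetic data: the target depends only on features 0 (linearly)
# and 1 (nonlinearly); features 2..9 are irrelevant.
X = rng.normal(size=(n, d))
y = 2.0 * X[:, 0] + np.sin(3.0 * X[:, 1]) + 0.05 * rng.normal(size=n)

# Hypothetical nonlinear basis expansion: each feature contributes a
# raw column and a sine-transformed column (a crude stand-in for the
# per-feature kernel expansion the paper would use).
Phi = np.hstack([X, np.sin(3.0 * X)])

# Sparse Bayesian learning drives the weights of irrelevant basis
# functions toward zero (automatic relevance determination).
model = ARDRegression().fit(Phi, y)

# Aggregate the linear and nonlinear weight magnitudes back to a
# single relevance score per original feature.
relevance = np.abs(model.coef_).reshape(2, d).sum(axis=0)
selected = np.argsort(relevance)[::-1][:2]
print(sorted(selected))
```

On this toy problem the two truly relevant features (0 and 1) receive the largest relevance scores, while the irrelevant candidates are pruned — the kind of false-alarm reduction the abstract attributes to sparsity.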