Large Margin Subspace Learning for feature selection

  • Authors:
  • Bo Liu; Bin Fang; Xinwang Liu; Jie Chen; Zhenghong Huang; Xiping He

  • Affiliations:
  • College of Computer Science, Chongqing University, Chongqing 400044, PR China and School of Computer Science and Information Engineering, Chongqing Technology and Business University, Chongqing, PR China; College of Computer Science, Chongqing University, Chongqing 400044, PR China; School of Computer Science, National University of Defense Technology, Changsha, Hunan 410073, PR China; Institut Charles Delaunay, Université de Technologie de Troyes, France; School of Computer Science and Information Engineering, Chongqing Technology and Business University, Chongqing, PR China; School of Computer Science and Information Engineering, Chongqing Technology and Business University, Chongqing, PR China

  • Venue:
  • Pattern Recognition
  • Year:
  • 2013


Abstract

Recent research has shown the benefits of the large margin framework for feature selection. In this paper, we propose a novel feature selection algorithm, termed Large Margin Subspace Learning (LMSL), which seeks a projection matrix that maximizes the margin of a given sample, defined as the distance between the sample's nearest missing (its nearest neighbor with a different label) and its nearest hit (its nearest neighbor with the same label). Instead of computing the nearest neighbors of the given sample directly, we treat each sample with a different (same) label as a potential nearest missing (hit), with the probability estimated by kernel density estimation. In this way, the nearest missing (hit) is computed as an expectation over all different-class (same-class) samples. To perform feature selection, an ℓ2,1-norm penalty is imposed on the projection matrix to enforce row-sparsity. An efficient algorithm is then proposed to solve the resulting optimization problem. Comprehensive experiments compare the proposed algorithm with five state-of-the-art algorithms: RFS, SPFS, mRMR, TR, and LLFS. It outperforms the first four and matches the performance of LLFS with significantly faster computation.
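
As a rough sketch of the quantities just described (not the authors' implementation), the Python fragment below computes a Relief-style reading of the margin: the expected nearest missing and nearest hit are kernel-weighted averages over the other samples, and the ℓ2,1-norm of the projection matrix W measures row-sparsity. The Gaussian kernel, the bandwidth sigma, and all function names here are assumptions; the paper's exact probability model and solver are not reproduced.

    import numpy as np

    def soft_neighbor(X, i, mask, W, sigma=1.0):
        # Expected neighbor of sample i over the candidates selected by
        # `mask`, weighted by a Gaussian kernel on projected distances
        # (a KDE-style soft assignment; assumes mask selects >= 1 sample).
        d2 = np.sum(((X[mask] - X[i]) @ W) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * sigma ** 2))
        return (w / w.sum()) @ X[mask]

    def expected_margins(X, y, W, sigma=1.0):
        # Margin of each sample, read in the Relief sense: projected
        # distance to its expected nearest missing minus projected
        # distance to its expected nearest hit. X is (n, d), y is a
        # NumPy array of n labels, W is a (d, m) projection matrix.
        n = len(y)
        margins = np.empty(n)
        for i in range(n):
            same = (y == y[i])
            same[i] = False                    # hits: same label, excluding i
            diff = (y != y[i])                 # missings: different label
            nh = soft_neighbor(X, i, same, W, sigma)
            nm = soft_neighbor(X, i, diff, W, sigma)
            margins[i] = (np.linalg.norm((X[i] - nm) @ W)
                          - np.linalg.norm((X[i] - nh) @ W))
        return margins

    def l21_norm(W):
        # Row-wise l2,1-norm: the sum of the l2 norms of W's rows.
        # Penalizing it drives whole rows of W to zero, so the
        # surviving rows index the selected features.
        return np.linalg.norm(W, axis=1).sum()

A full method along these lines would choose W to maximize the summed margins minus a multiple of l21_norm(W), then rank features by the row norms of W; that solver is deliberately omitted here.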