Minimum-maximum local structure information for feature selection

  • Authors:
  • Wenjun Hu; Kup-Sze Choi; Yonggen Gu; Shitong Wang

  • Affiliations:
  • School of Information and Engineering, Huzhou Teachers College, Huzhou, Zhejiang, China and School of Digital Media, Jiangnan University, Wuxi, Jiangsu, China and Centre for Integrative Digital He ...
  • Centre for Integrative Digital Health, School of Nursing, Hong Kong Polytechnic University, Hong Kong, China
  • School of Information and Engineering, Huzhou Teachers College, Huzhou, Zhejiang, China
  • School of Digital Media, Jiangnan University, Wuxi, Jiangsu, China

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2013

Quantified Score

Hi-index 0.10

Abstract

Feature selection methods have been extensively applied in machine learning tasks such as computer vision, pattern recognition, and data mining. These methods aim to identify a subset of the original features with high discriminating power. Among them, feature selection techniques for unsupervised tasks are particularly attractive, since obtaining class labels and/or between-class information is often costly. On the other hand, the low-dimensional manifold of the ''same'' class data is usually revealed by exploiting the local invariance of the data structure; this alone may not be adequate for unsupervised tasks where class information is completely absent. In this paper, a novel feature selection method, called Minimum-maximum local structure information Laplacian Score (MMLS), is proposed to simultaneously minimize the within-locality information (i.e., preserve the manifold structure of the ''same'' class data) and maximize the between-locality information (i.e., maximize the information between the manifold structures of ''different'' class data). The effectiveness of the proposed algorithm is demonstrated with experiments on classification and clustering.
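The exact MMLS objective is not reproduced in this abstract, but the general idea it builds on can be sketched: score each feature by how little it varies across within-locality pairs (k-nearest neighbors, standing in for ''same'' class structure) relative to how much it varies across between-locality pairs (non-neighbors, standing in for ''different'' class structure). The graph construction, binary weighting, and the score ratio below are illustrative assumptions, not the authors' formulation:

```python
import numpy as np

def locality_score(X, k=5):
    """Hypothetical sketch of locality-based unsupervised feature scoring,
    not the paper's exact MMLS criterion: a good feature has small spread
    over within-locality (k-NN) pairs and large spread over
    between-locality (non-neighbor) pairs."""
    n, d = X.shape
    # Pairwise squared Euclidean distances in the full feature space.
    D = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # Within-locality graph: symmetrized k-nearest-neighbor adjacency.
    W = np.zeros((n, n))
    for i in range(n):
        nn = np.argsort(D[i])[1:k + 1]  # skip self at position 0
        W[i, nn] = 1.0
    W = np.maximum(W, W.T)
    # Between-locality graph: all non-neighbor, non-self pairs.
    B = 1.0 - W - np.eye(n)
    scores = np.empty(d)
    for r in range(d):
        f = X[:, r] - X[:, r].mean()
        diff = (f[:, None] - f[None, :]) ** 2
        within = (W * diff).sum()    # small if feature preserves locality
        between = (B * diff).sum()   # large if feature separates localities
        scores[r] = between / (within + 1e-12)
    return scores

# Toy data: feature 0 separates two clusters, feature 1 is pure noise.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([0, 0], [0.1, 1.0], (20, 2)),
               rng.normal([5, 0], [0.1, 1.0], (20, 2))])
s = locality_score(X)
print(s.argmax())  # feature 0 should score highest
```

On such data the discriminative feature gets a much larger score than the noise feature, which is the behavior the minimum-maximum objective is designed to reward.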