Methods of forward feature selection based on the aggregation of classifiers generated by single attribute

  • Authors:
  • Linkai Luo;Lingjun Ye;Meixiang Luo;Dengfeng Huang;Hong Peng;Fan Yang

  • Affiliations:
  • Department of Automation, Xiamen University, Xiamen 361005, PR China (all authors)

  • Venue:
  • Computers in Biology and Medicine
  • Year:
  • 2011

Abstract

Compared to backward feature selection (BFS) methods in gene expression data analysis, forward feature selection (FFS) methods can obtain an expected feature subset with fewer iterations. However, far fewer FFS methods than BFS methods are available, so more efficient FFS methods need to be developed. In this paper, two FFS methods based on the pruning of classifier ensembles generated by single attributes are proposed for gene selection. The main contributions are as follows: (1) a new loss function, the p-insensitive loss function, is proposed to overcome the disadvantage of the margin Euclidean distance loss function in the pruning of classifier ensembles; (2) two FFS methods based on the margin Euclidean distance loss function and the p-insensitive loss function, named FFS-ACSA1 and FFS-ACSA2 respectively, are proposed; (3) comparison experiments on four gene expression datasets show that FFS-ACSA2 obtains the best results among the three FFS methods compared (signal-to-noise ratio (SNR), FFS-ACSA1 and FFS-ACSA2) and is competitive with the well-known support vector machine-based recursive feature elimination (SVM-RFE), while FFS-ACSA1 is unstable.
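To make the general idea concrete, the following is a minimal, hypothetical Python sketch of forward selection over an aggregation of single-attribute classifiers: one classifier is fitted per attribute, and the ensemble is grown greedily by adding whichever single-attribute classifier most improves the majority-vote accuracy on a held-out split. This is not the paper's FFS-ACSA1/FFS-ACSA2; the accuracy criterion stands in for the margin Euclidean distance or p-insensitive loss used there, and all function names, the logistic-regression base learner, and the split sizes are illustrative assumptions.

# Hypothetical sketch of FFS over single-attribute classifiers (not the paper's method).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def forward_ensemble_selection(X, y, n_features):
    # Hold out part of the data to score candidate ensembles.
    X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)
    # One classifier per attribute, fitted on that single attribute.
    members = [LogisticRegression(max_iter=1000).fit(X_tr[:, [j]], y_tr)
               for j in range(X.shape[1])]
    # Cache each member's 0/1 predictions on the validation split.
    votes = np.column_stack([m.predict(X_va[:, [j]]) for j, m in enumerate(members)])
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(n_features):
        best_acc, best_j = -1.0, None
        for j in remaining:
            # Majority vote of the current ensemble plus candidate j.
            maj = (votes[:, selected + [j]].mean(axis=1) >= 0.5).astype(int)
            acc = (maj == y_va).mean()
            if acc > best_acc:
                best_acc, best_j = acc, j
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

if __name__ == "__main__":
    # Toy data standing in for a gene expression matrix (samples x genes).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(80, 15))
    y = (X[:, 2] - X[:, 9] > 0).astype(int)  # two informative "genes"
    print(forward_ensemble_selection(X, y, n_features=3))

Because each greedy step only evaluates candidate additions to the current ensemble, the number of iterations grows with the size of the selected subset rather than the full gene set, which is the efficiency argument the abstract makes for forward over backward selection.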