Fisher score and Laplacian score are two popular feature selection algorithms; both belong to the general graph-based feature selection framework, in which a feature subset is selected according to a subset-level score computed in trace ratio form. Because the number of candidate feature subsets grows exponentially with the number of features, a brute-force search for the subset with the maximum subset-level score is prohibitively expensive. Instead of scoring every subset, traditional methods compute a score for each individual feature and then select the top-ranked features according to these feature-level scores. However, a subset assembled from feature-level scores is not guaranteed to maximize the subset-level score. In this paper, we optimize the subset-level score directly and propose a novel algorithm that efficiently finds the globally optimal feature subset, i.e., the subset maximizing the subset-level score. Extensive experiments demonstrate the effectiveness of the proposed algorithm in comparison with traditional feature selection methods.
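The distinction between feature-level ranking and the subset-level trace ratio score can be illustrated with a small sketch. This is not the paper's efficient algorithm: the scatter matrices follow the standard Fisher/LDA definitions, while the function names, toy data, and brute-force subset search are illustrative assumptions used only to contrast the two scoring schemes.

```python
import numpy as np
from itertools import combinations

def scatter_matrices(X, y):
    """Between-class (Sb) and within-class (Sw) scatter matrices, as in Fisher/LDA."""
    mean = X.mean(axis=0)
    d = X.shape[1]
    Sb = np.zeros((d, d))
    Sw = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        diff = (mc - mean)[:, None]
        Sb += len(Xc) * diff @ diff.T   # spread of class means around the global mean
        Sw += (Xc - mc).T @ (Xc - mc)   # spread of samples around their class mean
    return Sb, Sw

def feature_level_scores(Sb, Sw):
    """Feature-level (Fisher-style) score: per-feature ratio of diagonal entries."""
    return np.diag(Sb) / np.diag(Sw)

def subset_level_score(Sb, Sw, idx):
    """Subset-level trace ratio score tr(Sb_idx) / tr(Sw_idx) for feature indices idx."""
    idx = list(idx)
    return np.trace(Sb[np.ix_(idx, idx)]) / np.trace(Sw[np.ix_(idx, idx)])

# Toy data (assumed for illustration): 5 features, only feature 0 separates the classes.
rng = np.random.default_rng(0)
X = rng.standard_normal((60, 5))
y = np.array([0] * 30 + [1] * 30)
X[y == 1, 0] += 2.0

Sb, Sw = scatter_matrices(X, y)
scores = feature_level_scores(Sb, Sw)

k = 2
# Traditional route: rank features individually and keep the top k.
ranked_subset = list(np.argsort(scores)[::-1][:k])
# Subset-level route (brute force, feasible only for tiny d): maximize the trace ratio.
best_subset = max(combinations(range(X.shape[1]), k),
                  key=lambda s: subset_level_score(Sb, Sw, s))
```

Since the trace only touches diagonal entries, the subset-level score here is a ratio of sums over the selected features; ranking the per-feature diagonal ratios does not in general maximize such a ratio of sums, which is precisely the gap between feature-level and subset-level selection that the paper addresses (its contribution is finding the subset-level optimum without the brute-force enumeration shown above).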