Feature selection is of great importance in data mining, especially for exploring high-dimensional data. Laplacian Score, a recently proposed feature selection method, exploits the local manifold structure of samples to select features and achieves good performance. However, it ignores the global structure of the samples, and the features it selects are highly redundant. To address these issues, we propose a feature selection method based on local and global structure preservation, LGFS for short. LGFS first uses two graphs, a nearest-neighborhood graph and a farthest-neighborhood graph, to describe the underlying local and global structure of the samples, respectively. It then defines a criterion that prefers features with a strong ability to preserve both local and global structure. To remove redundancy among the selected features, Extended LGFS (E-LGFS) uses normalized mutual information to measure the dependency between each pair of features. We conduct extensive experiments on two artificial data sets, six UCI data sets, and two publicly available face databases to evaluate LGFS and E-LGFS. The experimental results show that our methods achieve higher accuracies than the other unsupervised methods in the comparison.
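The redundancy measure used by E-LGFS, normalized mutual information between a pair of features, can be sketched as below. This is a minimal illustration, not the paper's implementation: it assumes continuous features discretized into equal-width bins and normalizes mutual information by the geometric mean of the two marginal entropies; the abstract does not specify the estimator or normalization E-LGFS actually uses.

```python
import numpy as np

def normalized_mutual_information(x, y, bins=8):
    """NMI between two feature vectors, in [0, 1].

    Each feature is discretized into `bins` equal-width bins (an
    assumption for illustration), then MI is estimated from the joint
    histogram and normalized by sqrt(H(X) * H(Y)).
    """
    # Discretize each feature with its own equal-width bin edges.
    xd = np.digitize(x, np.histogram_bin_edges(x, bins=bins)[1:-1])
    yd = np.digitize(y, np.histogram_bin_edges(y, bins=bins)[1:-1])

    # Joint and marginal probability estimates from the 2-D histogram.
    joint = np.histogram2d(xd, yd, bins=(bins, bins))[0]
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)   # shape (1, bins)

    # MI = sum_xy p(x,y) log( p(x,y) / (p(x) p(y)) ), over nonzero cells.
    nz = pxy > 0
    mi = np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))

    # Marginal entropies for the normalization.
    hx = -np.sum(px[px > 0] * np.log(px[px > 0]))
    hy = -np.sum(py[py > 0] * np.log(py[py > 0]))
    if hx == 0.0 or hy == 0.0:
        return 0.0
    return mi / np.sqrt(hx * hy)
```

A feature identical to itself scores 1, while two independent features score near 0, so a redundancy filter can discard a candidate feature whose NMI with an already-selected feature exceeds a threshold.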