In machine learning and pattern recognition, feature selection has long been an active research topic. Unsupervised feature selection is challenging because no labels are available to supply categorical information, so defining an appropriate selection metric is the key problem. In this paper, we propose a "filter" method for unsupervised feature selection based on the geometric properties of the ℓ1 graph, which is constructed through sparse coding. The graph captures the relations among feature subspaces, and the quality of each feature is evaluated by its locality-preserving ability on the graph. We compare our method with classic unsupervised feature selection methods (Laplacian score and Pearson correlation) and a supervised method (Fisher score) on benchmark data sets. Classification results obtained with support vector machines, k-nearest neighbors, and multi-layer feed-forward networks demonstrate the efficiency and effectiveness of our method.
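The pipeline sketched in the abstract — sparsely code each sample over the remaining samples to obtain ℓ1-graph weights, then rank features by how well they preserve the graph's local structure — can be illustrated with a minimal NumPy sketch. This is an assumption-laden illustration, not the authors' exact algorithm: the sparse coding step here uses plain ISTA (iterative soft-thresholding), and the feature criterion is a Laplacian-score-style ratio; the paper's precise formulation may differ.

```python
# Hypothetical sketch of an l1-graph feature score (NOT the paper's exact
# algorithm). Each sample is sparsely coded over the remaining samples via
# ISTA; the absolute coefficients define graph weights W; features are then
# ranked by a Laplacian-score-style criterion (smaller = better locality
# preservation).
import numpy as np

def sparse_code(y, D, lam=0.1, n_iter=200):
    """Approximately solve min_x 0.5*||y - D x||^2 + lam*||x||_1 with ISTA."""
    x = np.zeros(D.shape[1])
    step = 1.0 / (np.linalg.norm(D, 2) ** 2 + 1e-12)  # 1 / Lipschitz const.
    for _ in range(n_iter):
        grad = D.T @ (D @ x - y)
        x = x - step * grad
        # soft-thresholding (proximal step for the l1 penalty)
        x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)
    return x

def l1_graph(X, lam=0.1):
    """Build a symmetric weight matrix: row i holds |sparse coefficients|
    of sample i reconstructed from all other samples (rows of X)."""
    n = X.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        idx = [j for j in range(n) if j != i]
        coeffs = sparse_code(X[i], X[idx].T, lam)
        W[i, idx] = np.abs(coeffs)
    return (W + W.T) / 2.0

def graph_feature_scores(X, W):
    """Laplacian-score-style criterion per feature; smaller scores mean the
    feature varies little along graph edges relative to its variance."""
    d = W.sum(axis=1)
    L = np.diag(d) - W  # graph Laplacian
    scores = []
    for f in range(X.shape[1]):
        v = X[:, f] - (X[:, f] @ d) / (d.sum() + 1e-12)  # remove weighted mean
        num = v @ L @ v
        den = v @ (d * v) + 1e-12
        scores.append(num / den)
    return np.array(scores)
```

A feature that takes similar values on strongly connected samples (e.g. a feature separating clusters that the ℓ1 graph recovers) gets a small score and would be kept; a noise feature scores near 1 and would be discarded.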