Feature selection (FS) methods are commonly used to select the relevant features of a dataset. In this paper, we propose a novel unsupervised FS method, termed locality and similarity preserving embedding (LSPE). Specifically, a nearest-neighbor graph is first constructed to preserve the locality structure of the data points; this locality structure is then mapped onto the reconstruction coefficients so that the similarity among the data points is preserved. The sparsity derived from the locality is preserved as well. Finally, the low-dimensional embedding of the sparse reconstruction is evaluated so as to best preserve both locality and similarity. We impose an ℓ2,1-norm penalty on the transformation matrix to achieve row sparsity, which allows us to select relevant features and learn the embedding simultaneously. The selected features are stable owing to the locality and similarity preservation and, more importantly, carry natural discriminating information even when no class labels are provided. We present the optimization algorithm and a convergence analysis of the proposed method. Extensive experimental results demonstrate the effectiveness of the proposed method.
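As a rough illustration of the row-sparsity idea behind the ℓ2,1-norm penalty, the sketch below scores each feature by the ℓ2 norm of its row in a transformation matrix `W` and keeps the top-k rows. This is only the final feature-scoring step under an assumed, hand-picked `W`; it is not the LSPE optimization itself, and all names and values here are hypothetical.

```python
import numpy as np

def l21_norm(W):
    # ℓ2,1-norm: the sum of the ℓ2 norms of the rows of W.
    return float(np.sum(np.linalg.norm(W, axis=1)))

def select_features(W, k):
    # Rank features by the ℓ2 norm of the corresponding row of W.
    # Under an ℓ2,1 penalty, rows of irrelevant features are driven
    # toward zero, so large row norms indicate relevant features.
    scores = np.linalg.norm(W, axis=1)
    return np.argsort(scores)[::-1][:k]

# Toy transformation matrix (4 features, 2 embedding dimensions):
# rows 0 and 2 carry most of the energy, mimicking row sparsity.
W = np.array([[0.90, 0.40],
              [0.01, 0.02],
              [0.70, 0.60],
              [0.00, 0.05]])

print(round(l21_norm(W), 3))              # total ℓ2,1-norm of W
print(sorted(select_features(W, 2).tolist()))  # indices of the 2 top-scoring features
```

In a full LSPE-style pipeline, `W` would instead come from minimizing the embedding objective plus the ℓ2,1 penalty, but the selection rule (keep the features with the largest row norms) is the same.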