Feature selection has attracted considerable research interest in recent years. Traditional learning-based feature selection methods separate embedding learning from feature ranking. In this paper, we introduce a novel unsupervised feature selection approach, Joint Embedding Learning and Sparse Regression (JELSR). Instead of simply employing the graph Laplacian for embedding learning and then applying regression, we construct the graph with weights obtained by locally linear approximation and unify embedding learning and sparse regression into a single objective for feature selection. By adding an ℓ2,1-norm regularizer, we learn a sparse transformation matrix whose row norms rank the features. We also provide an effective algorithm to solve the resulting optimization problem. Compared with traditional unsupervised feature selection methods, our approach integrates the merits of embedding learning and sparse regression simultaneously. Extensive experimental results demonstrate its validity.
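To make the sparse-regression side of this idea concrete, the following is a minimal sketch (not the authors' implementation) of ℓ2,1-norm regularized regression solved by iteratively reweighted least squares, with features ranked by the row norms of the learned matrix W. In JELSR the embedding Y is learned jointly with W; here Y is assumed to be given (e.g., a precomputed spectral embedding), and the function name, parameters, and solver choice are illustrative assumptions.

```python
import numpy as np

def l21_regression_ranking(X, Y, beta=1.0, n_iter=50, eps=1e-8):
    """Sketch: solve min_W ||X W - Y||_F^2 + beta * ||W||_{2,1}
    by iteratively reweighted least squares (IRLS), then rank
    features by the row norms of W (larger norm = more important).

    X : (n, d) data matrix, one sample per row
    Y : (n, k) low-dimensional embedding, assumed given here
    """
    n, d = X.shape
    D = np.eye(d)  # reweighting matrix, initialized to identity
    for _ in range(n_iter):
        # Closed-form update for the reweighted least-squares problem
        W = np.linalg.solve(X.T @ X + beta * D, X.T @ Y)
        row_norms = np.linalg.norm(W, axis=1)
        # Rows with small norm get large penalties and shrink toward zero
        D = np.diag(1.0 / (2.0 * row_norms + eps))
    scores = np.linalg.norm(W, axis=1)
    return np.argsort(-scores), scores
```

Because the ℓ2,1 norm sums the Euclidean norms of W's rows, whole rows are driven to zero together, so a feature is either kept for all embedding dimensions or discarded entirely; the surviving row norms give a natural ranking for selecting the top features.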