Hypergraph based information-theoretic feature selection
Pattern Recognition Letters
Most existing feature selection methods rank individual features according to a utility criterion and select the optimal feature set greedily. The feature combinations found in this way rarely give optimal classification performance, however, since they neglect the correlations among features. To overcome this problem, we develop a novel unsupervised feature selection technique based on hypergraph spectral embedding, in which the projection matrix is constrained to be a selection matrix designed to select the optimal feature subset. First, by incorporating multidimensional interaction information (MII) to measure higher order similarities, we establish a novel hypergraph framework that characterizes the multiple relationships within a set of samples, so that the structural information latent in the data can be modeled more effectively. Second, we derive a hypergraph embedding view of feature selection that casts feature discriminant analysis into a regression framework accounting for the correlations among features. As a result, we can evaluate joint feature combinations, rather than being confined to considering them individually, and are thus able to handle feature redundancy. Experimental results on a number of standard datasets demonstrate the effectiveness of our feature selection method.
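The pipeline described in the abstract can be sketched in outline: build a hypergraph over the samples, embed it spectrally via the normalized hypergraph Laplacian, then regress the embedding onto the features and score each feature by the norm of its coefficient row, so features are judged jointly rather than one at a time. The sketch below is a minimal illustration under simplifying assumptions, not the authors' exact method: hyperedges are formed by a plain k-nearest-neighbour heuristic rather than the MII-based construction, and a ridge regression replaces the constrained selection-matrix formulation.

```python
import numpy as np

def hypergraph_feature_select(X, n_select, k=3, n_embed=2):
    """Toy hypergraph-embedding feature selection (illustrative only)."""
    n, d = X.shape
    # Standardize features so coefficient magnitudes are comparable.
    Xs = (X - X.mean(0)) / X.std(0)
    # One hyperedge per sample: the sample plus its k nearest neighbours
    # (a simplifying assumption; the paper builds hyperedges from MII).
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    H = np.zeros((n, n))
    for i in range(n):
        H[np.argsort(D2[i])[:k + 1], i] = 1.0
    # Normalized hypergraph Laplacian with unit hyperedge weights:
    # L = I - Dv^{-1/2} H De^{-1} H^T Dv^{-1/2}
    Dv, De = H.sum(1), H.sum(0)
    Dv_is = np.diag(1.0 / np.sqrt(Dv))
    L = np.eye(n) - Dv_is @ H @ np.diag(1.0 / De) @ H.T @ Dv_is
    # Spectral embedding: eigenvectors of L with the smallest eigenvalues,
    # skipping the trivial leading one.
    _, vecs = np.linalg.eigh(L)
    Y = vecs[:, 1:1 + n_embed]
    # Ridge regression of the embedding on the features; scoring each
    # feature by the l2-norm of its coefficient row evaluates features
    # jointly and lets redundant features share (and so lose) weight.
    W = np.linalg.solve(Xs.T @ Xs + 1e-3 * np.eye(d), Xs.T @ Y)
    scores = np.linalg.norm(W, axis=1)
    return np.argsort(scores)[::-1][:n_select]
```

On a toy dataset whose first two features encode a four-cluster structure and whose remaining features are noise, the two structural features should receive the highest scores.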