Unions of subspaces provide a powerful generalization of single-subspace models for collections of high-dimensional data; however, learning multiple subspaces from data is challenging because segmentation (identifying the points that lie in the same subspace) and subspace estimation must be performed simultaneously. Recently, sparse recovery methods were shown to provide a provable and robust strategy for exact feature selection (EFS): recovering subsets of points from the ensemble that lie in the same subspace. In parallel with recent studies of EFS with ℓ1-minimization, in this paper we develop sufficient conditions for EFS with a greedy method for sparse signal recovery known as orthogonal matching pursuit (OMP). Following our analysis, we provide an empirical study of feature selection strategies for signals living on unions of subspaces and characterize the gap between sparse recovery methods and nearest-neighbor (NN) approaches. In particular, we demonstrate that sparse recovery methods offer significant advantages over NN methods, and that the gap between the two approaches is especially pronounced when the sampling of subspaces in the data set is sparse. Our results suggest that OMP can reliably recover exact feature sets in a number of regimes where NN approaches fail to reveal the subspace membership of points in the ensemble.
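To make the greedy method concrete, the following is a minimal sketch of orthogonal matching pursuit: at each step it selects the dictionary column most correlated with the current residual, then re-fits all selected columns by least squares. This is the generic OMP recursion, not the paper's exact EFS procedure (in the subspace-clustering setting, the "dictionary" would be the other points in the ensemble); the function name and interface here are illustrative.

```python
import numpy as np

def omp(D, y, k):
    """Greedy sparse approximation of y over the columns of D.

    D : (m, n) dictionary with (approximately) unit-norm columns
    y : (m,) signal to approximate
    k : number of atoms (sparsity level) to select
    Returns the sparse coefficient vector and the selected support.
    """
    residual = y.copy()
    support = []
    coef = np.zeros(0)
    for _ in range(k):
        # Greedy selection: column most correlated with the residual.
        j = int(np.argmax(np.abs(D.T @ residual)))
        support.append(j)
        # Orthogonal step: least-squares fit on all selected columns,
        # then update the residual against that projection.
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x, support
```

With an orthonormal dictionary and a noiseless k-sparse signal, this recursion recovers the true support exactly in k steps; the interesting regimes studied in the paper are overcomplete dictionaries, where recovery depends on conditions such as those developed in the analysis.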