This paper studies the subspace segmentation problem, which aims to segment data drawn from a union of multiple linear subspaces. Recent works based on sparse representation, low-rank representation, and their extensions have attracted much attention. If the subspaces from which the data are drawn are independent or orthogonal, these methods obtain a block diagonal affinity matrix, which usually leads to a correct segmentation; the main difference among them lies in their objective functions. We theoretically show that if the objective function satisfies certain conditions, and the data are sufficiently sampled from independent subspaces, the obtained affinity matrix is always block diagonal. Furthermore, the data sampling can be insufficient if the subspaces are orthogonal. Several existing methods are special cases of this result. We then present the Least Squares Regression (LSR) method for subspace segmentation. LSR takes advantage of the correlation among data points, which is common in real data, and encourages a grouping effect that tends to group highly correlated data together. Experimental results on the Hopkins 155 database and the Extended Yale B database show that our method significantly outperforms state-of-the-art methods. Beyond segmentation accuracy, all experiments demonstrate that LSR is also much more efficient.
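As a sketch of the LSR idea summarized above: the coefficient matrix minimizing a ridge-regularized self-representation objective, min_Z ||X - XZ||_F^2 + λ||Z||_F^2, has the closed-form solution Z = (X^T X + λI)^{-1} X^T X, and a symmetrized |Z| then serves as the affinity matrix for spectral clustering. The function name, the λ value, and the symmetrization step below are illustrative assumptions, not the paper's exact pipeline.

```python
import numpy as np

def lsr_affinity(X, lam=0.01):
    """Illustrative LSR-style affinity (hypothetical helper, not the authors' code).

    X   : d x n data matrix, one sample per column.
    lam : ridge regularization weight (assumed value).

    Solves min_Z ||X - X Z||_F^2 + lam * ||Z||_F^2 in closed form:
        Z = (X^T X + lam I)^{-1} X^T X
    and returns a symmetric affinity W = |Z| + |Z|^T for spectral clustering.
    """
    n = X.shape[1]
    G = X.T @ X                                  # Gram matrix of the samples
    Z = np.linalg.solve(G + lam * np.eye(n), G)  # closed-form representation
    W = np.abs(Z) + np.abs(Z).T                  # symmetrize for clustering
    return W
```

For data sampled from orthogonal subspaces, the Gram matrix is block diagonal, so the resulting affinity is block diagonal as well, matching the theoretical claim in the abstract; in practice one would feed W into a spectral clustering method such as Normalized Cuts.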