Methods of L1-estimation of a covariance matrix
Computational Statistics & Data Analysis - Special issue on statistical data analysis based on the L1 norm and related methods
Nonlinear component analysis as a kernel eigenvalue problem
Neural Computation
IEEE Transactions on Pattern Analysis and Machine Intelligence
A Framework for Robust Subspace Learning
International Journal of Computer Vision - Special Issue on Computational Vision at Brown University
Convex Optimization
CVPR '05 Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05) - Volume 1
R1-PCA: rotational invariant L1-norm principal component analysis for robust subspace factorization
ICML '06 Proceedings of the 23rd international conference on Machine learning
Learning a Mahalanobis distance metric for data clustering and classification
Pattern Recognition
Principal Component Analysis Based on L1-Norm Maximization
IEEE Transactions on Pattern Analysis and Machine Intelligence
Retrieval based interactive cartoon synthesis via unsupervised bi-distance metric learning
MM '09 Proceedings of the 17th ACM international conference on Multimedia
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics - Special issue on gait analysis
Flexible manifold embedding: a framework for semi-supervised and unsupervised dimension reduction
IEEE Transactions on Image Processing
IEEE Transactions on Information Theory
Robust Tensor Analysis With L1-Norm
IEEE Transactions on Circuits and Systems for Video Technology
Robust tensor clustering with non-greedy maximization
IJCAI'13 Proceedings of the Twenty-Third international joint conference on Artificial Intelligence
Shifted subspaces tracking on sparse outlier for motion segmentation
IJCAI'13 Proceedings of the Twenty-Third international joint conference on Artificial Intelligence
Principal Component Analysis (PCA) is one of the most important methods for handling high-dimensional data. However, its high computational complexity makes it hard to apply to large-scale, high-dimensional data, and its reliance on the l2-norm makes it sensitive to outliers. A recent work proposed principal component analysis based on l1-norm maximization, which is efficient and robust to outliers. In that work, a greedy strategy was applied because the l1-norm maximization problem is difficult to solve directly, but such a strategy is prone to getting stuck in local solutions. In this paper, we first propose an efficient optimization algorithm to solve a general l1-norm maximization problem, and then propose a robust principal component analysis with non-greedy l1-norm maximization. Experimental results on real-world datasets show that the non-greedy method consistently obtains a much better solution than the greedy method.
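The non-greedy scheme described above optimizes all projection directions jointly rather than one at a time. A minimal sketch of such an iteration, under the common formulation maximizing the sum of l1-norms of the projected samples over an orthonormal projection matrix (function and parameter names here are illustrative, not from the paper):

```python
import numpy as np

def l1_pca_nongreedy(X, k, n_iter=100, seed=0):
    """Sketch of a non-greedy L1-norm PCA iteration.

    Seeks a column-orthonormal W (d x k) that makes sum_i ||W^T x_i||_1 large.
    X is a (d, n) data matrix, assumed already centered.
    """
    d, n = X.shape
    rng = np.random.default_rng(seed)
    # Random orthonormal initialization via QR decomposition.
    W, _ = np.linalg.qr(rng.standard_normal((d, k)))
    for _ in range(n_iter):
        S = np.sign(W.T @ X)        # (k, n) sign matrix of projections
        M = X @ S.T                 # (d, k) weighted data summary
        # Closest orthonormal matrix to M (polar decomposition via SVD):
        # this step updates all k directions at once, not greedily.
        U, _, Vt = np.linalg.svd(M, full_matrices=False)
        W_new = U @ Vt
        if np.allclose(W_new, W):   # converged to a fixed point
            break
        W = W_new
    return W
```

Each iteration fixes the projection signs, then solves the resulting orthogonal Procrustes subproblem in closed form, so the objective does not decrease; this is what distinguishes the joint update from the greedy one-direction-at-a-time strategy.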