The L1 norm has been applied in numerous variations of principal component analysis (PCA). L1-norm PCA is an attractive alternative to traditional L2-based PCA because it can impart robustness in the presence of outliers and is indicated for models where standard Gaussian assumptions about the noise may not apply. Of the previously proposed schemes that recast PCA as an optimization problem involving the L1 norm, none provides a globally optimal solution in polynomial time. This paper proposes an L1-norm PCA procedure based on the efficient calculation of the globally optimal solution of the L1-norm best-fit hyperplane problem. We present a procedure, called L1-PCA*, that applies this idea to fit data to subspaces of successively smaller dimension. The procedure is implemented and tested on a diverse problem suite. Our tests show that L1-PCA* is the indicated procedure in the presence of unbalanced outlier contamination.
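To illustrate why an L1 (least-absolute-deviations) fit resists outliers in a way an L2 (least-squares) fit does not, here is a toy sketch in two dimensions. This is not the paper's L1-PCA* algorithm; it is a brute-force LAD line fit that exploits the known property that an optimal least-absolute-deviations line passes through at least two of the data points, so enumerating point pairs finds a global optimum. The function name `lad_line` and the example data are illustrative choices, not from the paper.

```python
from itertools import combinations

def lad_line(points):
    """Brute-force least-absolute-deviations (L1) line fit in 2D.

    An optimal LAD line passes through at least two of the data
    points, so enumerating all point pairs finds a global optimum
    in O(n^3) time for this small illustration.
    Returns (total L1 error, slope, intercept).
    """
    best = None
    for (x1, y1), (x2, y2) in combinations(points, 2):
        if x1 == x2:
            continue  # skip vertical candidates in this sketch
        a = (y2 - y1) / (x2 - x1)   # slope of the candidate line
        b = y1 - a * x1             # intercept of the candidate line
        err = sum(abs(y - (a * x + b)) for x, y in points)
        if best is None or err < best[0]:
            best = (err, a, b)
    return best

# Four points on the line y = 2x plus one gross outlier.
# The L1 fit recovers slope 2 and intercept 0, absorbing the
# outlier's residual instead of letting it tilt the line, which
# is exactly the robustness property the abstract describes.
pts = [(0, 0), (1, 2), (2, 4), (3, 6), (4, 100)]
err, a, b = lad_line(pts)
```

A least-squares fit on the same data would be pulled far off the line y = 2x by the single point (4, 100); the L1 criterion pays only the outlier's absolute residual and leaves the fit on the uncontaminated points.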