Robust Principal Component Analysis with Adaptive Selection for Tuning Parameters (The Journal of Machine Learning Research)
High breakdown estimators for principal components: the projection-pursuit approach revisited (Journal of Multivariate Analysis)
Robust PCA and classification in biosciences (Bioinformatics)
Fast cross-validation of high-breakdown resampling methods for PCA (Computational Statistics & Data Analysis)
Principal component analysis for data containing outliers and missing elements (Computational Statistics & Data Analysis)
An adjusted boxplot for skewed distributions (Computational Statistics & Data Analysis)
Outliers in biometrical data: what's old, what's new (International Journal of Biometrics)
Robust M-estimation of multivariate GARCH models (Computational Statistics & Data Analysis)
Editorial: Special issue on variable selection and robust procedures (Computational Statistics & Data Analysis)
Detecting influential observations in principal components and common principal components (Computational Statistics & Data Analysis)
Detecting influential observations in Kernel PCA (Computational Statistics & Data Analysis)
Robust classification for skewed data (Advances in Data Analysis and Classification)
Robust multivariate association and dimension reduction using density divergences (Journal of Multivariate Analysis)
The outlier sensitivity of classical principal component analysis (PCA) has spurred the development of robust techniques. Existing robust PCA methods such as ROBPCA work best when the non-outlying data follow an approximately symmetric distribution; when the original variables are skewed, too many points tend to be flagged as outlying. A robust PCA method is developed which is also suitable for skewed data, and a new outlier map is defined to flag the outliers. The method's performance is illustrated on real data from economics, engineering, and finance, and confirmed by a simulation study.
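The key idea behind robust PCA for skewed data can be sketched as follows: measure each point's outlyingness along random projection directions using a scale estimate that treats the upper and lower halves of the projected data separately (so a long but legitimate tail is not penalized), then compute classical PCA on the least-outlying points. This is an illustrative simplification, not the published algorithm: the published method uses medcouple-adjusted cutoffs, whereas the one-sided MADs and the `keep` fraction below are hypothetical stand-ins.

```python
import numpy as np

def adjusted_outlyingness(X, n_dirs=250, seed=0):
    """Skewness-adjusted outlyingness (simplified sketch).

    For each random direction, points above the projected median are
    scaled by the upper one-sided MAD and points below by the lower
    one-sided MAD, so a skewed bulk does not inflate outlyingness on
    its long-tailed side. (A simplified stand-in for the
    medcouple-based adjustment used in the published method.)
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    dirs = rng.standard_normal((n_dirs, p))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    ao = np.zeros(n)
    for d in dirs:
        z = X @ d
        med = np.median(z)
        upper = max(np.median(z[z >= med] - med), 1e-8)
        lower = max(np.median(med - z[z <= med]), 1e-8)
        out = np.where(z >= med, (z - med) / upper, (med - z) / lower)
        ao = np.maximum(ao, out)  # worst outlyingness over all directions
    return ao

def robust_pca_skewed(X, n_components=2, keep=0.75):
    """Classical PCA on the fraction of points with smallest outlyingness."""
    ao = adjusted_outlyingness(X)
    core = ao <= np.quantile(ao, keep)
    Xc = X[core] - X[core].mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:n_components], ao
```

Because the scale is estimated separately on each side of the median, points deep in a heavy right tail of a skewed bulk receive moderate outlyingness, while genuinely deviating points still score high and are excluded from the PCA fit.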