Common principal components & related multivariate models
Computational Statistics & Data Analysis
Detecting outlying observations is an important step in any analysis, even when robust estimates are used. In particular, the robustified Mahalanobis distance is a natural measure of outlyingness if one focuses on ellipsoidal distributions. However, it is well known that the asymptotic chi-square approximation for the cutoff value of the Mahalanobis distance based on several robust estimates (like the minimum volume ellipsoid, the minimum covariance determinant and the S-estimators) is not adequate for detecting atypical observations in small samples from the normal distribution. In the multi-population setting and under a common principal components model, aggregated measures based on standardized empirical influence functions are used to detect observations with a significant impact on the estimators. As in the one-population setting, the cutoff values obtained from the asymptotic distribution of those aggregated measures are not adequate for small samples. More appropriate cutoff values, adapted to the sample sizes, can be computed by using a cross-validation approach. Cutoff values obtained from a Monte Carlo study using S-estimators are provided for illustration. A real data set is also analyzed.
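The point about the asymptotic chi-square cutoff being inadequate in small samples can be illustrated with a short simulation. The sketch below, a minimal one-population analogue of the idea, compares the classical chi-square cutoff for squared robust Mahalanobis distances with a Monte Carlo cutoff adapted to the sample size. It uses scikit-learn's MCD estimator as the robust scatter estimate; the paper's S-estimator-based cutoffs and the multi-population CPC setting are not reproduced here, and the function `monte_carlo_cutoff` is a hypothetical helper for illustration.

```python
import numpy as np
from scipy.stats import chi2
from sklearn.covariance import MinCovDet  # MCD stands in for the paper's S-estimators

def monte_carlo_cutoff(n, p, alpha=0.025, n_rep=100, seed=0):
    """Simulate the (1 - alpha) quantile of squared robust Mahalanobis
    distances under N_p(0, I) samples of size n, pooling distances
    across replications. This adapts the cutoff to the sample size n."""
    rng = np.random.default_rng(seed)
    pooled = []
    for _ in range(n_rep):
        X = rng.standard_normal((n, p))
        d2 = MinCovDet(random_state=0).fit(X).mahalanobis(X)
        pooled.append(d2)
    return np.quantile(np.concatenate(pooled), 1 - alpha)

n, p = 30, 3
asymptotic = chi2.ppf(0.975, df=p)   # classical asymptotic cutoff
simulated = monte_carlo_cutoff(n, p)  # cutoff adapted to n = 30
```

In small normal samples the simulated quantile of the robust distances typically differs markedly from the chi-square quantile, which is precisely why sample-size-adapted cutoffs are needed for outlier detection.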