The present paper discusses robustness against outliers in principal component analysis (PCA). We propose a class of PCA procedures based on the minimum psi principle, which unifies various approaches, including the classical procedure and several recently proposed ones. Two algorithms are investigated with respect to robustness: the reweighted matrix algorithm for off-line data and the gradient algorithm for on-line data. The reweighted matrix algorithm is shown to satisfy a desirable local convergence property, and the on-line gradient algorithm is shown to be asymptotically stable. Some procedures in the class involve tuning parameters that control sensitivity to outliers; we propose a shape-adaptive rule for selecting these parameters using K-fold cross validation.
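The reweighted matrix idea for off-line data can be sketched as follows. This is a minimal illustration, not the paper's exact procedure: a Huber-type weight stands in for the psi-derived weight, and the function name, the cutoff `c`, and the stopping rule are all assumptions. Each sample is downweighted according to its reconstruction error in the current subspace, the weighted covariance matrix is re-estimated, and its leading eigenvectors are extracted until the subspace stabilizes.

```python
import numpy as np

def robust_pca_reweighted(X, k, c=2.0, n_iter=50, tol=1e-8):
    """Hypothetical sketch of a reweighted-matrix algorithm for robust PCA.

    X      : (n, d) data matrix
    k      : number of principal components to extract
    c      : cutoff of the Huber-type weight (stands in for a psi-based weight)
    Returns the robust location estimate and a (d, k) orthonormal basis.
    """
    n, d = X.shape
    mu = X.mean(axis=0)
    Xc = X - mu
    # Initialize with classical PCA (top-k right singular vectors).
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:k].T
    for _ in range(n_iter):
        # Reconstruction error of each sample in the current subspace.
        resid = Xc - (Xc @ V) @ V.T
        err = np.linalg.norm(resid, axis=1)
        # Huber-type weights: samples with large error are downweighted.
        w = np.where(err <= c, 1.0, c / np.maximum(err, 1e-12))
        # Re-estimate the weighted mean and covariance.
        mu = np.average(X, axis=0, weights=w)
        Xc = X - mu
        C = (Xc * w[:, None]).T @ Xc / w.sum()
        # eigh returns eigenvalues in ascending order; take the top k.
        _, vecs = np.linalg.eigh(C)
        V_new = vecs[:, ::-1][:, :k]
        # Stop when the projection onto the subspace no longer changes.
        if np.linalg.norm(V_new @ V_new.T - V @ V.T) < tol:
            V = V_new
            break
        V = V_new
    return mu, V
```

Because an outlying sample enters the covariance estimate with a small weight, a few gross outliers perturb the fitted subspace far less than in classical PCA, where each sample contributes with weight one.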