Linear Discriminant Analysis (LDA) is a well-known supervised feature extraction method for subspace learning in computer vision and pattern recognition. In this paper, a novel LDA method based on a new Maximum Correntropy Criterion (MCC) optimization technique is proposed. Conventional LDA, which is based on the L2-norm, is sensitive to the presence of outliers. The proposed method has several advantages: first, it is robust to large outliers; second, it is invariant to rotations; third, it can be solved effectively by a half-quadratic optimization algorithm, in which each iteration reduces the complex optimization problem to a quadratic problem that can be solved efficiently as a weighted eigenvalue problem. The proposed method can handle non-Gaussian noise and substantially reduces the influence of large outliers, resulting in robust classification. Performance assessments on several datasets show that the proposed approach addresses the outlier issue more effectively than traditional methods.
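The half-quadratic scheme described above can be sketched as an iteratively reweighted LDA: each iteration solves a sample-weighted generalized eigenvalue problem, then re-weights every sample with a Gaussian (correntropy) kernel of its projected residual so that outliers are progressively down-weighted. The code below is a minimal illustrative sketch, not the authors' implementation; the function name `correntropy_lda`, the regularization term, and the specific re-weighting rule are assumptions for demonstration.

```python
import numpy as np

def correntropy_lda(X, y, sigma=1.0, n_iter=10):
    """Illustrative half-quadratic sketch of correntropy-based LDA.

    Alternates between (a) solving a sample-weighted LDA eigenproblem and
    (b) updating per-sample weights with a Gaussian (correntropy) kernel
    of each sample's projected distance to its class mean, so that large
    outliers receive vanishing influence. Hypothetical sketch, not the
    paper's exact algorithm.
    """
    classes = np.unique(y)
    n, d = X.shape
    w = np.ones(n)  # half-quadratic auxiliary weights, one per sample
    for _ in range(n_iter):
        # Weighted global mean, within-class and between-class scatter.
        mu = np.average(X, axis=0, weights=w)
        Sw = np.zeros((d, d))
        Sb = np.zeros((d, d))
        for c in classes:
            idx = y == c
            wc = w[idx]
            mu_c = np.average(X[idx], axis=0, weights=wc)
            Xc = X[idx] - mu_c
            Sw += (Xc * wc[:, None]).T @ Xc
            Sb += wc.sum() * np.outer(mu_c - mu, mu_c - mu)
        # Weighted eigenvalue step: leading eigenvector of Sw^{-1} Sb
        # (small ridge term added so Sw is safely invertible).
        evals, evecs = np.linalg.eig(
            np.linalg.solve(Sw + 1e-6 * np.eye(d), Sb))
        v = np.real(evecs[:, np.argmax(np.real(evals))])
        v /= np.linalg.norm(v)
        # Correntropy re-weighting: Gaussian kernel of projected residual.
        e = np.empty(n)
        for c in classes:
            idx = y == c
            mu_c = np.average(X[idx], axis=0, weights=w[idx])
            e[idx] = (X[idx] - mu_c) @ v
        w = np.exp(-e**2 / (2 * sigma**2))
    return v, w
```

On a toy two-class problem with one gross outlier, the returned weights show the outlier being suppressed while inliers keep weights near one, which is the mechanism the abstract credits for the method's robustness.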