Robust mixture modelling using the t distribution
Statistics and Computing
A Framework for Robust Subspace Learning
International Journal of Computer Vision - Special Issue on Computational Vision at Brown University
On the Noise Model of Support Vector Machine Regression
Feature selection, L1 vs. L2 regularization, and rotational invariance
ICML '04: Proceedings of the Twenty-First International Conference on Machine Learning
CVPR '05: Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Volume 1
Robust probabilistic projections
ICML '06: Proceedings of the 23rd International Conference on Machine Learning
R1-PCA: rotational invariant L1-norm principal component analysis for robust subspace factorization
ICML '06: Proceedings of the 23rd International Conference on Machine Learning
IEEE Transactions on Pattern Analysis and Machine Intelligence
Further to our recent work on robust L1 PCA, we introduce a new version of the robust PCA model based on the so-called multivariate Laplace distribution (termed the L1 distribution) proposed by Eltoft et al. [2006. On the multivariate Laplace distribution. IEEE Signal Process. Lett. 13(5), 300-303]. Owing to the heavy tails and strong component dependency of the multivariate L1 distribution, the proposed model is expected to be more robust against data outliers and better able to fit dependencies among components. In addition, we demonstrate how a variational approximation scheme enables effective inference of the key parameters in the probabilistic multivariate L1-PCA model; this yields tractable Bayesian inference through a variational EM-type algorithm.
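The multivariate Laplace distribution of Eltoft et al. can be written as a Gaussian scale mixture with an exponential mixing variable, which is the source of the heavy tails mentioned above. The sketch below (an illustration only, not the paper's implementation; the function name and parameter choices are our own) samples from this construction and checks that the marginals exhibit the positive excess kurtosis a Gaussian lacks:

```python
import numpy as np

def sample_multivariate_laplace(mu, Gamma, n, rng):
    """Draw n samples from the multivariate Laplace distribution of
    Eltoft et al. (2006) via its Gaussian scale-mixture construction:
    x = mu + sqrt(z) * L @ eps, with z ~ Exp(1), eps ~ N(0, I),
    and L the Cholesky factor of the scale matrix Gamma."""
    d = len(mu)
    L = np.linalg.cholesky(Gamma)
    z = rng.exponential(scale=1.0, size=(n, 1))   # exponential mixing variable
    eps = rng.standard_normal((n, d))             # isotropic Gaussian draws
    return mu + np.sqrt(z) * (eps @ L.T)

rng = np.random.default_rng(0)
mu = np.zeros(2)
Gamma = np.array([[1.0, 0.5],
                  [0.5, 1.0]])
x = sample_multivariate_laplace(mu, Gamma, 100_000, rng)

# Heavy tails: the excess kurtosis of each marginal is positive
# (it is 0 for a Gaussian; for this mixture it is 3 in theory).
kurt = (x**4).mean(axis=0) / (x**2).mean(axis=0)**2 - 3.0
print(kurt)
```

Because the mixing variable z scales both coordinates jointly, the samples are also dependent across components even though the underlying Gaussian draws are isotropic, matching the "component dependency" property the abstract highlights.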