Mixtures of probabilistic principal component analyzers
Neural Computation
Robust mixture modelling using the t distribution
Statistics and Computing
Adapting Kernels by Variational Approach in SVM
AI '02 Proceedings of the 15th Australian Joint Conference on Artificial Intelligence: Advances in Artificial Intelligence
On the Noise Model of Support Vector Machine Regression
Feature selection, L1 vs. L2 regularization, and rotational invariance
ICML '04 Proceedings of the twenty-first international conference on Machine learning
Learning Nonlinear Image Manifolds by Global Alignment of Local Linear Models
IEEE Transactions on Pattern Analysis and Machine Intelligence
Visualization of Non-vectorial Data Using Twin Kernel Embedding
AIDM '06 Proceedings of the International Workshop on Integrating AI and Data Mining
Kernel Laplacian Eigenmaps for Visualization of Non-vectorial Data
AI'06 Proceedings of the 19th Australian joint conference on Artificial Intelligence: advances in Artificial Intelligence
L1 LASSO Modeling and Its Bayesian Inference
AI '08 Proceedings of the 21st Australasian Joint Conference on Artificial Intelligence: Advances in Artificial Intelligence
Recently, a robust probabilistic L1-PCA model was introduced in [1] by replacing the conventional Gaussian noise model with the Laplacian (L1) model. Owing to the heavy tails of the L1 distribution, the model is more robust to outliers in the data. In this paper, we generalize L1-PCA to a mixture of L1 distributions so that the model can handle data containing multiple clusters. For model learning, we exploit the fact that the L1 density can be expressed as a superposition of an infinite number of Gaussian densities, which enables tractable Bayesian learning and inference via a variational EM-type algorithm.
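The Gaussian-superposition property invoked above is the standard scale-mixture representation of the Laplace distribution: if x | w ~ N(0, w) and the variance w is exponentially distributed with mean 2b^2, then marginally x ~ Laplace(0, b). A minimal numerical sketch (illustrative parameter values, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
b = 1.5        # Laplace scale parameter (illustrative choice)
n = 200_000

# Scale-mixture construction: draw a variance w ~ Exp(mean = 2*b**2),
# then draw x | w ~ N(0, w). Marginally, x is Laplace(0, b).
w = rng.exponential(scale=2 * b**2, size=n)
x = rng.normal(0.0, np.sqrt(w))

# Sanity checks against known Laplace moments:
# variance = 2*b**2, excess kurtosis = 3 (heavy tails).
var_x = np.var(x)
excess_kurt = np.mean(x**4) / var_x**2 - 3
print(f"empirical variance   {var_x:.3f}  (Laplace: {2 * b**2:.3f})")
print(f"empirical ex. kurtosis {excess_kurt:.2f}  (Laplace: 3)")
```

This representation is what makes the variational EM treatment tractable: conditioned on the latent variances w, the model is Gaussian, so the usual conjugate updates apply.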