Missing data are common in real-world data sets and pose a problem for many estimation techniques. We have developed a variational Bayesian method to perform independent component analysis (ICA) on high-dimensional data containing missing entries. Missing data are handled naturally in the Bayesian framework by integrating over the missing entries under the generative density model. Modeling the distributions of the independent sources with mixtures of Gaussians allows sources to be estimated with different kurtosis and skewness. Unlike the maximum likelihood approach, the variational Bayesian method automatically determines the dimensionality of the data and yields an accurate density model for the observed data without overfitting. The technique is also extended to the clusters-of-ICA model and to a supervised classification framework.
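To illustrate the key idea of handling missing entries by marginalization rather than imputation, here is a minimal sketch. It assumes a simplified linear Gaussian generative model (the paper's method uses mixture-of-Gaussians sources and a full variational treatment); the variable names and dimensions are illustrative, not from the paper. For a Gaussian, integrating out missing dimensions amounts to dropping the corresponding rows and columns of the covariance:

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)

# Hypothetical linear generative model x = A s + noise, with Gaussian
# sources for tractability (the paper models sources as mixtures of
# Gaussians to capture kurtosis and skewness).
d_src, d_obs = 2, 4
A = rng.normal(size=(d_obs, d_src))
noise_var = 0.1

# Marginal covariance of x under this model: A A^T + noise_var * I
cov_x = A @ A.T + noise_var * np.eye(d_obs)

x = rng.normal(size=d_obs)
observed = np.array([True, False, True, True])  # second entry missing

# Marginalizing the missing dimension of a Gaussian just selects the
# observed rows/columns of the covariance -- no imputation is needed.
sub_cov = cov_x[np.ix_(observed, observed)]
loglik = multivariate_normal(
    mean=np.zeros(observed.sum()), cov=sub_cov
).logpdf(x[observed])
print(loglik)
```

Because the likelihood of the observed entries is available in closed form, incomplete rows contribute to estimation directly, instead of being discarded or filled in by a separate imputation step.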