Fast and robust fixed-point algorithms for independent component analysis. IEEE Transactions on Neural Networks.
Nonparametric factor analysis with beta process priors. ICML '09: Proceedings of the 26th Annual International Conference on Machine Learning.
Sparse linear identifiable multivariate modeling. Journal of Machine Learning Research.
The Indian buffet process: an introduction and review. Journal of Machine Learning Research.
Closed-form EM for sparse coding and its application to source separation. LVA/ICA '12: Proceedings of the 10th International Conference on Latent Variable Analysis and Signal Separation.
Complex extension of infinite sparse factor analysis for blind speech separation. LVA/ICA '12: Proceedings of the 10th International Conference on Latent Variable Analysis and Signal Separation.
Dictionary learning for noisy and incomplete hyperspectral images. SIAM Journal on Imaging Sciences.
Infinite sparse factor analysis for blind source separation in reverberant environments. SSPR/SPR '12: Proceedings of the 2012 Joint IAPR International Conference on Structural, Syntactic, and Statistical Pattern Recognition.
Bayesian nonparametrics for microphone array processing. IEEE/ACM Transactions on Audio, Speech, and Language Processing.
A nonparametric Bayesian extension of Independent Component Analysis (ICA) is proposed in which the observed data Y are modelled as a linear superposition, G, of a potentially infinite number of hidden sources, X. Whether a given source is active for a specific data point is specified by an infinite binary matrix, Z, on which we place an Indian Buffet Process (IBP) prior. The resulting sparse representation allows greater data reduction than standard ICA. We describe four variants of the model, combining Gaussian or Laplacian priors on X with the one- or two-parameter IBP. We demonstrate Bayesian inference under these models using a Markov chain Monte Carlo (MCMC) algorithm on synthetic and gene expression data, and compare to standard ICA algorithms.
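The generative model in the abstract can be sketched as follows: draw a binary activity matrix Z from the one-parameter IBP via its sequential (restaurant-style) construction, draw Gaussian source amplitudes X, and mix linearly with additive noise. This is an illustrative sketch only; the matrix orientations, the noise level `sigma`, and all variable names here are assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_ibp(n_points, alpha, rng):
    """Draw a binary matrix Z (n_points x K) from the one-parameter IBP.

    Sequential construction: point i reuses an existing source k with
    probability m_k / (i + 1), where m_k counts previous users of source k,
    then activates Poisson(alpha / (i + 1)) brand-new sources.
    """
    counts = []                                  # m_k for each existing source
    rows = []
    for i in range(n_points):
        row = [rng.random() < m / (i + 1) for m in counts]   # reuse old sources
        k_new = rng.poisson(alpha / (i + 1))                 # open new sources
        counts = [m + int(used) for m, used in zip(counts, row)] + [1] * k_new
        row += [True] * k_new
        rows.append(row)
    Z = np.zeros((n_points, len(counts)), dtype=int)
    for i, row in enumerate(rows):
        Z[i, :len(row)] = row                    # later rows are longer
    return Z

# Generative sketch: Y = (Z * X) @ G + noise, with Gaussian-prior sources.
N, alpha, D, sigma = 100, 2.0, 5, 0.1
Z = sample_ibp(N, alpha, rng)                    # which sources are active where
K = Z.shape[1]                                   # number of instantiated sources
X = rng.standard_normal((N, K))                  # Gaussian source amplitudes
G = rng.standard_normal((K, D))                  # mixing matrix
Y = (Z * X) @ G + sigma * rng.standard_normal((N, D))
```

Only finitely many sources are ever instantiated for finite data (K grows roughly as alpha times the harmonic number of N), which is what makes inference over the "infinite" matrix Z tractable.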