Independent component analysis, a new concept? Signal Processing, special issue on higher-order statistics.
A unifying review of linear Gaussian models. Neural Computation.
Mixing matrix recovery of underdetermined source separation based on sparse representation. CIS '07: Proceedings of the 2007 International Conference on Computational Intelligence and Security.
Inferring parameters and structure of latent variable models by variational Bayes. UAI '99: Proceedings of the Fifteenth Conference on Uncertainty in Artificial Intelligence.
Expectation-propagation for the generative aspect model. UAI '02: Proceedings of the Eighteenth Conference on Uncertainty in Artificial Intelligence.
Variational Bayes for generalized autoregressive models. IEEE Transactions on Signal Processing.
Variational learning for Gaussian mixture models. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics.
Detectors for discrete-time signals in non-Gaussian noise. IEEE Transactions on Information Theory.
De-noising by soft-thresholding. IEEE Transactions on Information Theory.
Fast and robust fixed-point algorithms for independent component analysis. IEEE Transactions on Neural Networks.
This paper presents a variational Bayes expectation-maximization (VB-EM) algorithm for time series based on Attias' variational Bayesian theory. The algorithm is applied to the blind source separation (BSS) problem to estimate both the source signals and the mixing matrix under the optimal model structure. Because its elements are correlated, the mixing matrix is assigned a matrix Gaussian distribution, and the inverse covariance (precision) of the sensor noise is assigned a Wishart distribution to capture correlation between sensor noises. A mixture of Gaussians is used to approximate the distribution of each independent source. Update rules for the posterior hyperparameters and for the posterior of the model structure are derived, and the structure with the largest posterior is selected as optimal. For that structure, the source signals are estimated by applying a least-mean-square (LMS) estimator to the posterior distribution of the hidden variables, and the mixing matrix by applying a maximum a posteriori (MAP) estimator to the posterior distribution of the model parameters. Tests on synthetic data show that (1) the log posterior of the model structure increases with the accuracy of the posterior mixing matrix, and (2) the accuracies of the prior mixing matrix, the estimated mixing matrix, and the estimated source signals increase with the log posterior of the model structure. The algorithm is also applied to magnetoencephalography (MEG) data to localize the sources of equivalent current dipoles.
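The generative model the abstract describes — sources drawn from a mixture of Gaussians, a matrix-Gaussian mixing matrix, and correlated sensor noise whose precision is Wishart distributed — can be sketched as a synthetic-data generator. This is a minimal NumPy illustration of sampling from that model, not the paper's inference algorithm; all dimensions, mixture parameters, and scales below are hypothetical choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, T = 3, 2, 500  # sensors, sources, time samples (hypothetical sizes)

# Each independent source follows a 2-component mixture of Gaussians.
weights = np.array([0.5, 0.5])
means = np.array([-2.0, 2.0])
stds = np.array([0.5, 0.5])
comp = rng.choice(2, size=(n, T), p=weights)   # component label per sample
S = rng.normal(means[comp], stds[comp])        # n x T source matrix

# Matrix-Gaussian mixing matrix: A = M + chol(U) @ Z @ chol(V).T,
# where U and V are the row and column covariances (identity here).
M = np.zeros((m, n))
U, V = np.eye(m), np.eye(n)
Z = rng.normal(size=(m, n))
A = M + np.linalg.cholesky(U) @ Z @ np.linalg.cholesky(V).T

# Wishart-distributed noise precision: sum of df outer products of
# Gaussian vectors (df >= m keeps it invertible almost surely).
df = m + 2
G = rng.normal(size=(m, df)) * 0.1             # scale sets the noise level
Lam = G @ G.T                                  # precision matrix
noise_cov = np.linalg.inv(Lam)

# Mixed observations with correlated sensor noise.
noise = rng.multivariate_normal(np.zeros(m), noise_cov, size=T).T
X = A @ S + noise
print(X.shape)  # (3, 500)
```

Synthetic data of this form is what the paper's experiments use: the inference task is then to recover `A` and `S` from `X` alone, scoring candidate model structures by their log posterior.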