Space or time adaptive signal processing by neural network models
AIP Conference Proceedings 151 on Neural Networks for Computing
Independent component analysis, a new concept?
Signal Processing - Special issue on higher order statistics
Probability theory
A fast fixed-point algorithm for independent component analysis
Neural Computation
Factorizing multivariate function classes
NIPS '97: Proceedings of the 1997 Conference on Advances in Neural Information Processing Systems 10
Blind source separation via the second characteristic function
Signal Processing
Linear geometric ICA: fundamentals and algorithms
Neural Computation
Adaptive Blind Signal and Image Processing: Learning Algorithms and Applications
Source separation in post-nonlinear mixtures
IEEE Transactions on Signal Processing
A blind source separation technique using second-order statistics
IEEE Transactions on Signal Processing
On the use of sparse signal decomposition in the analysis of multi-channel surface electromyograms
Signal Processing - Sparse approximations in signal and image processing
Joint low-rank approximation for extracting non-Gaussian subspaces
Signal Processing
Robust sparse component analysis based on a generalized Hough transform
EURASIP Journal on Applied Signal Processing
Signed-rank tests for location in the symmetric independent component model
Journal of Multivariate Analysis
On model identifiability in analytic postnonlinear ICA
Neurocomputing
Independent subspace analysis is unique, given irreducibility
ICA '07: Proceedings of the 7th International Conference on Independent Component Analysis and Signal Separation
Post nonlinear independent subspace analysis
ICANN '07: Proceedings of the 17th International Conference on Artificial Neural Networks
Statistical analysis of sample-size effects in ICA
IDEAL '07: Proceedings of the 8th International Conference on Intelligent Data Engineering and Automated Learning
A new geometrical BSS approach for non-negative sources
LVA/ICA '10: Proceedings of the 9th International Conference on Latent Variable Analysis and Signal Separation
Functional MRI analysis by a novel spatiotemporal ICA algorithm
ICANN '05: Proceedings of the 15th International Conference on Artificial Neural Networks: Biological Inspirations, Part I
Estimating non-Gaussian subspaces by characteristic functions
ICA '06: Proceedings of the 6th International Conference on Independent Component Analysis and Blind Signal Separation
Uniqueness of non-Gaussian subspace analysis
ICA '06: Proceedings of the 6th International Conference on Independent Component Analysis and Blind Signal Separation
ICA over finite fields: separability and algorithms
Signal Processing
Uniqueness of linear factorizations into independent subspaces
Journal of Multivariate Analysis
The goal of blind source separation (BSS) is to recover the original independent sources of a mixed random vector without knowledge of the mixing structure. A key ingredient for performing BSS successfully is to know the indeterminacies of the problem, that is, how the separating model relates to the original mixing model (separability). For linear BSS, Comon (1994) showed, using the Darmois-Skitovitch theorem, that the linear mixing matrix can be recovered up to permutation and scaling. In this work, a much simpler, direct proof of linear separability is given. The idea rests on the fact that a random vector has independent components if and only if the Hessian of its logarithmic density (resp. characteristic function) is diagonal everywhere. This property is then exploited to propose a new algorithm for performing BSS. Furthermore, first ideas on how to generalize separability results based on Hessian diagonalization to more complicated nonlinear models are studied in the setting of postnonlinear BSS.
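The diagonality criterion at the heart of the abstract can be illustrated numerically. The following is a minimal sketch, not the paper's algorithm: it approximates the Hessian of the log-density by central finite differences (the helper names `log_density_hessian` and `gaussian_logp` are invented for this example) and checks that for a Gaussian with independent components the off-diagonal entries vanish, while correlation produces nonzero cross-terms.

```python
import numpy as np

def log_density_hessian(logp, x, h=1e-4):
    """Numerical Hessian of a log-density logp at point x via central differences."""
    d = len(x)
    H = np.zeros((d, d))
    for i in range(d):
        for j in range(d):
            ei = np.zeros(d); ei[i] = h
            ej = np.zeros(d); ej[j] = h
            H[i, j] = (logp(x + ei + ej) - logp(x + ei - ej)
                       - logp(x - ei + ej) + logp(x - ei - ej)) / (4 * h * h)
    return H

def gaussian_logp(C):
    """Zero-mean Gaussian log-density with covariance C, up to an additive constant."""
    Ci = np.linalg.inv(C)
    return lambda x: -0.5 * x @ Ci @ x

x = np.array([0.3, -0.7])

# Independent components (diagonal covariance): Hessian is diagonal everywhere.
H_ind = log_density_hessian(gaussian_logp(np.diag([1.0, 2.0])), x)

# Dependent components (correlated covariance): off-diagonal entries are nonzero.
H_dep = log_density_hessian(gaussian_logp(np.array([[1.0, 0.8],
                                                    [0.8, 1.0]])), x)

print(np.round(H_ind, 3))  # off-diagonal entries ~ 0
print(np.round(H_dep, 3))  # off-diagonal entries clearly nonzero
```

For a Gaussian the Hessian of the log-density is the constant matrix -C^{-1}, so the criterion reduces to diagonality of the covariance; for non-Gaussian densities the Hessian varies with x, and the paper's point is that diagonality must hold at every point for the components to be independent.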