We propose two methods that reduce the post-nonlinear blind source separation (PNL-BSS) problem to a linear BSS problem. The first method is based on the concept of maximal correlation: we apply the alternating conditional expectation (ACE) algorithm, a powerful technique from non-parametric statistics, to approximately invert the componentwise nonlinear functions. The second method is a Gaussianizing transformation, motivated by the fact that the linearly mixed signals, before the nonlinear transformation, are approximately Gaussian distributed. This heuristic but simple and efficient procedure works as well as the ACE method. Within the framework provided by ACE, convergence can be proven, and the optimal transformations obtained by ACE coincide with the sought-after inverses of the nonlinearities. After the nonlinearities have been equalized, temporal decorrelation separation (TDSEP) recovers the source signals. Numerical simulations of "ACE-TD" and "Gauss-TD" on realistic examples yield excellent results.
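To illustrate the Gaussianization idea on a toy example, the sketch below maps each observed channel through its empirical CDF followed by the Gaussian quantile function, which (for a monotone componentwise distortion of an approximately Gaussian mixture) recovers the linear mixture up to a monotone rescaling. This is a minimal rank-based Gaussianization sketch, not the authors' implementation; the function name `gaussianize` and the numpy/scipy usage are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

def gaussianize(x):
    """Rank-based Gaussianization (illustrative sketch): map each row
    (channel) of x through its empirical CDF, then through the standard
    Gaussian quantile function, so every marginal becomes ~ N(0, 1)."""
    x = np.asarray(x, dtype=float)
    n = x.shape[-1]
    # ranks mapped into (0, 1); the +0.5 offset keeps norm.ppf finite
    ranks = (np.argsort(np.argsort(x, axis=-1), axis=-1) + 0.5) / n
    return norm.ppf(ranks)

# Toy post-nonlinear observation: a linear mixture of non-Gaussian
# sources, squashed componentwise by tanh (sources/mixing are made up).
rng = np.random.default_rng(0)
s = rng.laplace(size=(2, 5000))           # non-Gaussian sources
A = np.array([[1.0, 0.6], [0.4, 1.0]])    # linear mixing matrix
z = A @ s                                 # mixture, roughly Gaussian
x = np.tanh(z)                            # componentwise nonlinearity
x_lin = gaussianize(x)                    # ~ monotone rescaling of z
```

Because both `tanh` and the empirical-CDF map are monotone, `x_lin` is strongly correlated with the underlying linear mixture `z`, so a linear method such as TDSEP can then be applied.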