We propose a kernel-based nonlinear independent component analysis (ICA) method that consists of two separate steps. First, we map the data into a high-dimensional feature space and perform dimension reduction to extract the effective subspace; this is achieved by kernel principal component analysis (PCA) and can be regarded as a pre-processing step. Second, we adjust a linear transformation in this subspace to make the outputs as statistically independent as possible. In this way, nonlinear ICA, a complex nonlinear problem, is decomposed into two relatively standard procedures. Moreover, to overcome the ill-posedness of nonlinear ICA solutions, we exploit the minimal nonlinear distortion (MND) principle for regularization, in addition to a smoothness regularizer. The MND principle states that we prefer the nonlinear ICA solution whose mixing system has minimal nonlinear distortion, since in practice the nonlinearity in the data-generation procedure is usually not very strong.
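The two-step structure of the method can be sketched with off-the-shelf components. The sketch below is illustrative only: the toy mixture, the RBF kernel, and its `gamma` parameter are assumptions, FastICA stands in for the linear independence-maximization step, and the MND and smoothness regularizers described above are omitted.

```python
import numpy as np
from sklearn.decomposition import KernelPCA, FastICA

# Hypothetical toy data: two independent sources passed through a
# linear mixing matrix plus a weak nonlinear distortion.
rng = np.random.default_rng(0)
n = 500
s = np.column_stack([rng.uniform(-1, 1, n), rng.laplace(size=n)])
A = np.array([[1.0, 0.6], [0.4, 1.0]])
x = s @ A.T
x = x + 0.1 * np.tanh(x)  # mild nonlinearity, per the MND assumption

# Step 1 (pre-processing): kernel PCA maps the data into feature
# space and extracts a low-dimensional effective subspace.
kpca = KernelPCA(n_components=2, kernel="rbf", gamma=0.5)
z = kpca.fit_transform(x)

# Step 2: adjust a linear transformation in that subspace so the
# outputs are as statistically independent as possible (FastICA here).
ica = FastICA(n_components=2, random_state=0, max_iter=1000)
y = ica.fit_transform(z)
print(y.shape)  # (500, 2): n samples, two recovered components
```

Decomposing the problem this way keeps each stage standard: the nonlinearity is handled entirely by the kernel map, and the independence criterion is optimized over a plain linear transform.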