Within the natural-gradient framework, a novel Kurtosis-Dependent Parameterized Independent Component Analysis (KDPICA) algorithm is proposed that can separate mixtures of super- and sub-Gaussian sources. Two new parameterized probability density models are introduced, which cover wider kurtosis ranges, especially for sub-Gaussian sources. After whitening, the model parameters are computed adaptively from the kurtosis of each estimated source, so that the super- and sub-Gaussian source distributions and their corresponding score functions can be estimated directly. A stability analysis fixes the admissible ranges of the model parameters, which ensures that the KDPICA algorithm remains stable. Experiments show that the proposed algorithm outperforms several existing algorithms.
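The overall scheme described above (whitening, a kurtosis-based choice of score function per estimated source, and a natural-gradient update) can be sketched as follows. This is a minimal illustration only: the paper's specific parameterized density models are not reproduced here, so the sketch substitutes the standard extended-infomax score functions (`tanh(y)` for super-Gaussian, `y**3` for sub-Gaussian sources) as stand-ins, and the function name, learning rate, and iteration count are assumptions.

```python
import numpy as np

def natural_gradient_ica(X, lr=0.02, iters=2000):
    """Separate mixed signals with natural-gradient ICA, switching the
    score function by the sign of each source's excess kurtosis.

    Sketch only: uses extended-infomax score functions in place of the
    paper's kurtosis-dependent parameterized density models (KDPICA).
    X has shape (n_sources, n_samples)."""
    n, T = X.shape
    # Whitening step (the algorithm also whitens before adapting parameters).
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(X @ X.T / T)
    X = E @ np.diag(d ** -0.5) @ E.T @ X
    W = np.eye(n)
    for _ in range(iters):
        Y = W @ X
        # Excess kurtosis of each estimated source selects the score function:
        # positive -> super-Gaussian, negative -> sub-Gaussian.
        kurt = (Y ** 4).mean(axis=1) / (Y ** 2).mean(axis=1) ** 2 - 3
        phi = np.where(kurt[:, None] > 0, np.tanh(Y), Y ** 3)
        # Natural-gradient update: W <- W + lr * (I - E[phi(y) y^T]) W
        W += lr * (np.eye(n) - phi @ Y.T / T) @ W
    return W @ X, W

# Usage: mix one sub-Gaussian (square wave) and one super-Gaussian
# (Laplace) source, then recover them from the mixture.
t = np.linspace(0, 1, 5000)
s1 = np.sign(np.sin(2 * np.pi * 7 * t))          # sub-Gaussian
s2 = np.random.default_rng(1).laplace(size=5000)  # super-Gaussian
S = np.vstack([s1, s2])
A = np.array([[1.0, 0.6], [0.4, 1.0]])            # mixing matrix
Y, W = natural_gradient_ica(A @ S)
```

Each row of `Y` should match one original source up to sign and scale, which is the usual ICA indeterminacy.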