Independent component analysis by general nonlinear Hebbian-like learning rules
Signal Processing - Special issue on neural networks
The learning dynamics of an on-line Hebbian ICA algorithm close to its initial conditions have been studied. For large input dimension the dynamics can be described by a diffusion equation. A surprisingly large number of training examples and an unusually low initial learning rate are required to escape a stochastic trapping state near the initial conditions. Escape from this state breaks the symmetry of the weights, and the algorithm thereby avoids becoming trapped in the plateau-like fixed points that have been observed in other learning algorithms.
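The flavour of on-line Hebbian ICA discussed here can be illustrated with a minimal single-unit sketch. The update rule below, w ← w + η x g(wᵀx) with a cubic nonlinearity and renormalization to the unit sphere, is one standard Hebbian-like ICA rule (kurtosis-seeking), not necessarily the exact rule analysed in the paper; the input model, learning rate, and iteration count are illustrative assumptions. The overlap of the weight vector with the single non-Gaussian source direction is the order parameter whose growth signals escape from the trapping region.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100            # input dimension (illustrative)
eta = 0.01 / N     # small learning rate, scaled with dimension

def sample_input():
    """Whitened toy input: one unit-variance Laplacian (super-Gaussian)
    source along coordinate 0, Gaussian noise in all other directions."""
    x = rng.standard_normal(N)
    x[0] = rng.laplace(scale=1.0 / np.sqrt(2.0))
    return x

def g(y):
    """Cubic nonlinearity: a kurtosis-seeking Hebbian choice."""
    return y ** 3

# Random initial weights on the unit sphere: near-zero overlap with
# the source direction, i.e. close to the trapping region.
w = rng.standard_normal(N)
w /= np.linalg.norm(w)

for t in range(200_000):
    x = sample_input()
    y = w @ x
    w += eta * x * g(y)        # Hebbian-like on-line update
    w /= np.linalg.norm(w)     # constrain weights to the unit sphere

# Overlap with the non-Gaussian source direction; escape from the
# trapping state shows up as growth of this quantity towards 1.
overlap = abs(w[0])
print(f"overlap with source direction: {overlap:.2f}")
```

Starting from a small learning rate is consistent with the abstract's observation: too large an initial η keeps the weight vector diffusing near the symmetric initial conditions instead of letting the drift term break the symmetry.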