We study the behaviour at zero of the derivatives of the cost function used when training non-linear neural networks. It is shown that a fair number of first-, second- and higher-order derivatives vanish at zero, validating the belief that 0 is a peculiar and potentially harmful location. These calculations are related to practical and theoretical aspects of neural network training.
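The phenomenon described above can be illustrated with a minimal sketch (not the paper's own derivation): for a one-hidden-layer tanh network with squared loss and all parameters set to zero, backpropagation shows that the gradients with respect to both weight matrices vanish identically, since each is multiplied by a quantity (the hidden activation or the output weights) that is itself zero. The network shape, loss, and variable names below are illustrative assumptions.

```python
import numpy as np

def grads_at_zero(x, t, n_hidden=4):
    """Gradients of the squared loss for a 1-hidden-layer tanh net
    with ALL parameters initialised to zero (illustrative setup)."""
    W1 = np.zeros((n_hidden, x.size)); b1 = np.zeros(n_hidden)
    W2 = np.zeros((1, n_hidden));      b2 = np.zeros(1)
    # Forward pass: every pre-activation is zero, so h = tanh(0) = 0.
    h = np.tanh(W1 @ x + b1)            # = 0 vector
    y = W2 @ h + b2                     # = 0
    # Backward pass for L = 0.5 * (y - t)^2.
    dy  = y - t                         # nonzero in general
    dW2 = np.outer(dy, h)               # = 0: factor h is zero
    db2 = dy
    dh  = W2.T @ dy                     # = 0: factor W2 is zero
    dW1 = np.outer(dh * (1 - h**2), x)  # = 0: factor dh is zero
    db1 = dh * (1 - h**2)               # = 0
    return dW1, db1, dW2, db2

dW1, db1, dW2, db2 = grads_at_zero(np.array([1.0, -2.0]), t=1.0)
print(np.allclose(dW1, 0), np.allclose(dW2, 0))  # → True True
```

Only the output bias receives a nonzero gradient; all weight derivatives vanish at the origin, consistent with the claim that 0 is a degenerate point of the cost surface.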