Natural gradient works efficiently in learning
Neural Computation
LEARN++: an incremental learning algorithm for multilayer perceptron networks
ICASSP '00: Proceedings of the 2000 IEEE International Conference on Acoustics, Speech, and Signal Processing - Volume 06
In this paper, a new modular neural network architecture is designed to show that any continuous function defined on a compact set can be approximated by a multilayer perceptron when the output layer activation functions are linear and the hidden layer activation function may be chosen to be unbounded and non-sigmoidal. An application to econometric forecasting is proposed and analyzed, in which new function models can be added to the system one by one, so that a complex system can be formed easily.
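As a minimal illustration of the approximation setting described above (not the paper's own construction), the sketch below trains a one-hidden-layer perceptron with a ReLU hidden activation, which is unbounded and non-sigmoidal, and a linear output layer to approximate a continuous function on a compact interval. The target function, network sizes, and learning rate are all hypothetical choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: a continuous function on the compact set [0, pi] (illustrative choice).
x = np.linspace(0.0, np.pi, 200).reshape(-1, 1)
y = np.sin(x)

# One hidden layer with ReLU units (unbounded, non-sigmoidal) and a linear output.
H = 32                         # hidden width (hypothetical)
W1 = rng.normal(0.0, 1.0, (1, H))
b1 = rng.normal(0.0, 1.0, H)
W2 = rng.normal(0.0, 0.1, (H, 1))
b2 = np.zeros(1)

lr = 0.01
for _ in range(10000):
    # Forward pass.
    z = x @ W1 + b1            # hidden pre-activations
    h = np.maximum(z, 0.0)     # ReLU hidden layer
    pred = h @ W2 + b2         # linear output layer

    # Backward pass for mean-squared error.
    err = pred - y
    grad_W2 = h.T @ err / len(x)
    grad_b2 = err.mean(axis=0)
    dh = (err @ W2.T) * (z > 0)
    grad_W1 = x.T @ dh / len(x)
    grad_b1 = dh.mean(axis=0)

    W2 -= lr * grad_W2
    b2 -= lr * grad_b2
    W1 -= lr * grad_W1
    b1 -= lr * grad_b1

# Final mean-squared approximation error on the training grid.
mse = float(np.mean((np.maximum(x @ W1 + b1, 0.0) @ W2 + b2 - y) ** 2))
print(mse)
```

With enough hidden units, the approximation error on the compact set can be driven arbitrarily small, which is the substance of the universal-approximation claim; the modular aspect of the paper corresponds to adding further such function models to the system one at a time.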