An improved backpropagation algorithm is proposed that uses the Lyapunov method to minimize an absolute error function. The improved algorithm drives both the error and the gradient toward zero, so the local minima problem can be avoided. In addition, because the absolute error function is used, the algorithm is more robust and learns faster than backpropagation with the traditional squared error function when the target signals contain some incorrect data. The paper also shows how Lyapunov stability theory can be used to derive a learning algorithm that directly minimizes the absolute error function.
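To make the role of the absolute error function concrete, below is a minimal NumPy sketch of backpropagation through a one-hidden-layer network trained on an L1 loss, with a fraction of the targets deliberately corrupted to mimic the incorrect-data setting. The network shape, learning rate, and plain gradient step are illustrative assumptions; the paper's actual Lyapunov-derived update rule is not reproduced here. The key line is the gradient of |e|, sign(e), whose magnitude stays bounded at 1 no matter how wrong a target is, unlike the squared error's gradient 2e.

```python
# Sketch: backprop with an absolute (L1) error function and corrupted targets.
# All architecture choices and hyperparameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data; 10% of the targets are grossly corrupted to mimic
# the "incorrect data" setting in which the L1 loss is claimed to be robust.
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.sin(np.pi * X)
bad = rng.random(len(y)) < 0.1
y[bad] += rng.normal(0.0, 5.0, size=(bad.sum(), 1))  # gross target errors

# One hidden layer with tanh units, linear output.
W1 = rng.normal(0.0, 0.5, size=(1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, size=(16, 1)); b2 = np.zeros(1)
lr = 0.01

for epoch in range(2000):
    h = np.tanh(X @ W1 + b1)        # hidden activations
    out = h @ W2 + b2               # network output
    err = out - y
    # d|e|/de = sign(e): the gradient of the absolute error function keeps
    # unit magnitude even for large (possibly corrupted) residuals.
    g_out = np.sign(err) / len(X)
    g_W2 = h.T @ g_out;  g_b2 = g_out.sum(axis=0)
    g_h = g_out @ W2.T
    g_pre = g_h * (1.0 - h ** 2)    # tanh derivative
    g_W1 = X.T @ g_pre;  g_b1 = g_pre.sum(axis=0)
    W2 -= lr * g_W2; b2 -= lr * g_b2
    W1 -= lr * g_W1; b1 -= lr * g_b1

pred = np.tanh(X @ W1 + b1) @ W2 + b2
print("mean |error| on clean points:", np.abs(pred - y)[~bad].mean())
```

Under the squared error, the corrupted points would dominate the weight updates through their large residuals; with the absolute error they contribute the same unit-magnitude gradient as every other point, which is the robustness property the abstract claims.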