A backpropagation learning algorithm for feedforward neural networks with an adaptive learning rate is derived. The algorithm is based upon minimising the instantaneous output error and does not include any of the simplifications encountered in the corresponding Least Mean Square (LMS) algorithms for linear adaptive filters. The backpropagation algorithm with an adaptive learning rate, derived from a Taylor series expansion of the instantaneous output error, is shown to exhibit behaviour similar to that of the Normalised LMS (NLMS) algorithm. Indeed, the derived optimal adaptive learning rate of a neural network trained by backpropagation degenerates to the learning rate of the NLMS algorithm for a neuron with a linear activation function. By continuity, the optimal adaptive learning rate imposes an additional stabilising effect on the traditional backpropagation learning algorithm.
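The NLMS connection described in the abstract can be sketched for a single neuron. In this illustrative sketch (the function names, the learning-rate expression eta = 1 / (phi'(net)^2 * ||x||^2), and the epsilon regulariser are our own assumptions, not the paper's exact derivation), the adaptive learning rate normalises the gradient step by the input power scaled by the squared activation slope; for a linear activation the slope is 1 and the update reduces to the standard NLMS rule.

```python
import numpy as np

def adaptive_lr_update(w, x, d, phi, phi_prime, eps=1e-8):
    """One gradient step for a single neuron y = phi(w . x),
    using an NLMS-like adaptive learning rate
    eta = 1 / (phi'(net)^2 * ||x||^2 + eps).
    Illustrative sketch only; eps guards against division by zero."""
    net = w @ x
    e = d - phi(net)                       # instantaneous output error
    eta = 1.0 / (phi_prime(net) ** 2 * (x @ x) + eps)
    # Gradient of the instantaneous squared error w.r.t. w is -phi'(net) * e * x
    return w + eta * phi_prime(net) * e * x

# With a linear activation, phi'(net) = 1 and the update
# degenerates to the NLMS rule: w += e * x / ||x||^2.
linear = lambda v: v
linear_deriv = lambda v: 1.0

w = np.array([0.5, -0.2, 0.1])
x = np.array([1.0, 2.0, -1.0])
d = 1.5
w_new = adaptive_lr_update(w, x, d, linear, linear_deriv)
```

For the linear case, one such step drives the a posteriori error `d - w_new @ x` to (approximately) zero, which is exactly the defining property of NLMS with unit step size.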