Proceedings of the 1999 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP '99), Volume 2
Lower bounds are provided for the a posteriori prediction error of a nonlinear predictor realized as a neural network. These bounds are obtained for a priori adaptation and a posteriori error networks with sigmoid nonlinearities trained by gradient-descent learning algorithms. A contractivity condition is imposed on the nonlinear activation function of a neuron so that the a posteriori prediction error is smaller in magnitude than the corresponding a priori one. Furthermore, an upper bound is imposed on the learning rate η so that the approach remains feasible. The analysis is undertaken for both feedforward and recurrent nonlinear predictors realized as neural networks.
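As an illustration (not taken from the paper itself), the relationship between the a priori and a posteriori errors can be sketched for a single sigmoid neuron trained by gradient descent. The logistic sigmoid used here is a contraction (its slope is at most 1/4), and the learning rate is deliberately kept small; the specific values, the random data, and the teacher model are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def phi(v):
    # Logistic sigmoid; a contraction, since |phi'(v)| <= 1/4 < 1
    return 1.0 / (1.0 + np.exp(-v))

def dphi(v):
    s = phi(v)
    return s * (1.0 - s)

N = 4        # tap-input (filter) length -- arbitrary choice for the sketch
eta = 0.1    # learning rate, kept small so the update does not overshoot
w = 0.1 * rng.standard_normal(N)

trials = 1000
shrunk = 0
for _ in range(trials):
    x = rng.standard_normal(N)                 # tap-input vector
    d = phi(x @ rng.standard_normal(N))        # desired response (toy teacher)
    v = x @ w
    e_prior = d - phi(v)                       # a priori error (before update)
    w = w + eta * e_prior * dphi(v) * x        # gradient-descent weight update
    e_post = d - phi(x @ w)                    # a posteriori error (after update)
    if abs(e_post) <= abs(e_prior):
        shrunk += 1

print(f"{shrunk}/{trials} trials had |e_post| <= |e_prior|")
```

By the mean value theorem, e_post = e_prior * (1 - η φ'(ξ) φ'(v) ||x||²) for some ξ between the old and new activation potentials, so with a contractive φ and a sufficiently small η the bracketed factor lies in (0, 1) and the a posteriori error is smaller in magnitude, which is the behavior the abstract describes.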