Theory and design of adaptive filters
Interference cancellation using radial basis function networks, Signal Processing
Neural Networks: A Comprehensive Foundation
Adaptive Filtering: Algorithms and Practical Implementation
Fast learning in networks of locally-tuned processing units, Neural Computation
Nonlinear adaptive prediction of speech with a pipelined recurrent neural network, IEEE Transactions on Signal Processing
Nonlinear adaptive prediction of nonstationary signals, IEEE Transactions on Signal Processing
IEEE Transactions on Neural Networks
This paper deals with the adaptation of radial basis function neural networks (RBF NNs). A new supervised training algorithm for RBF NNs is proposed. The method possesses the distinctive properties of Lyapunov theory-based adaptive filtering (LAF) in [1]-[2], and thus differs from the many RBF NN training algorithms based on gradient search. A new Lyapunov function of the error between the desired output and the RBF NN output is first defined, and the adaptation law is designed in the Lyapunov sense so that the network output converges asymptotically to the desired output. The error convergence analysis in this paper proves that the design of the new RBF NN training algorithm is independent of the statistical properties of the input and output signals. The new adaptation law also has better tracking capability than the LAF of [1]-[2]. The performance of the proposed technique is illustrated through the adaptive prediction of nonlinear and nonstationary speech signals.
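To make the idea concrete, the following is a minimal sketch (not the paper's exact algorithm) of a Lyapunov-style update for the output weights of a Gaussian RBF predictor. All names, constants, and the prediction setup are assumptions for illustration. The gain is chosen, as in normalized LAF-type schemes, so that the a posteriori error magnitude shrinks geometrically, |e_k| ≈ κ|e_{k-1}| with 0 < κ < 1, which makes the candidate Lyapunov function V_k = e_k² strictly decreasing regardless of the input signal's statistics.

```python
import numpy as np

def rbf_features(x, centers, width):
    # Gaussian radial basis activations for one input vector x
    d2 = np.sum((centers - x) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * width ** 2))

def lyapunov_rbf_predict(signal, n_taps=4, n_units=8, width=1.0, kappa=0.5):
    # One-step prediction of `signal` with an RBF network whose centers are
    # fixed and whose output weights follow a Lyapunov-style adaptation law
    # (a hypothetical sketch; parameter values are illustrative assumptions).
    rng = np.random.default_rng(0)
    centers = rng.uniform(-1.0, 1.0, size=(n_units, n_taps))  # fixed centers
    w = np.zeros(n_units)                 # adaptive output weights
    prev_err = 0.0                        # previous a posteriori error
    preds = np.zeros_like(signal)
    post_errors = np.zeros_like(signal)
    for k in range(n_taps, len(signal)):
        x = signal[k - n_taps:k]          # tapped-delay-line input
        phi = rbf_features(x, centers, width)
        preds[k] = phi @ w                # a priori network output
        a_err = signal[k] - preds[k]      # a priori prediction error
        if abs(a_err) > 1e-12:
            # Normalized gain scaled so that after the update the error
            # magnitude contracts to roughly kappa * |prev_err|.
            g = (phi / (phi @ phi + 1e-8)) * (
                1.0 - kappa * abs(prev_err) / abs(a_err))
            w = w + g * a_err
        prev_err = signal[k] - phi @ w    # a posteriori error
        post_errors[k] = prev_err
    return preds, post_errors
```

Because the contraction factor κ is fixed by design rather than tuned against signal statistics, no step-size selection based on input power or eigenvalue spread is needed, which is the property the abstract emphasizes over gradient-search training.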