Stochastic Gradient-Adaptive Complex-Valued Nonlinear Neural Adaptive Filters With a Gradient-Adaptive Step Size

  • Authors:
  • Su Lee Goh; D. P. Mandic

  • Affiliations:
  • Imperial College London, London, UK

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2007


Abstract

A class of variable step-size learning algorithms for complex-valued nonlinear adaptive finite impulse response (FIR) filters is proposed. To achieve this, a general complex-valued nonlinear gradient-descent (CNGD) algorithm with a fully complex nonlinear activation function is first derived. To improve the convergence and robustness of CNGD, a gradient-adaptive step size is then introduced, giving a class of variable step-size CNGD (VSCNGD) algorithms. Analysis and simulations show that the proposed class of algorithms exhibits fast convergence and is able to track nonlinear and nonstationary complex-valued signals. To support the derivation, an analysis of the stability and computational complexity of the proposed algorithms is provided. Simulations on colored, nonlinear, and real-world complex-valued signals support the analysis.
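The abstract describes a complex-valued nonlinear FIR filter trained by gradient descent with an adaptive step size. The following is a minimal NumPy sketch of that idea, not the paper's exact algorithm: it uses a fully complex tanh activation and a Mathews-style gradient-correlation step-size update as stand-ins for the paper's derived VSCNGD updates; the function name `vscngd` and all parameter values are illustrative assumptions.

```python
import numpy as np

def vscngd(x, d, order=4, eta0=0.01, rho=1e-4):
    """Sketch of a variable step-size complex nonlinear gradient-descent
    FIR filter. The step-size update here is a Mathews-style gradient
    correlation, chosen as a plausible stand-in for the paper's own
    class of gradient-adaptive updates (an assumption)."""
    n = len(x)
    w = np.zeros(order, dtype=complex)          # complex FIR weights
    eta = eta0                                   # adaptive step size
    grad_prev = np.zeros(order, dtype=complex)   # previous gradient
    y = np.zeros(n, dtype=complex)
    for k in range(order, n):
        xk = x[k - order:k][::-1]                # tap-input vector
        net = np.dot(w, xk)                      # linear filter output
        y[k] = np.tanh(net)                      # fully complex activation
        e = d[k] - y[k]                          # complex output error
        dphi = 1.0 - np.tanh(net) ** 2           # derivative of tanh
        # CNGD-style gradient for fully complex activations:
        # derivative of |e|^2 w.r.t. the conjugate weights
        grad = e * np.conj(dphi) * np.conj(xk)
        # gradient-adaptive step size: correlate successive gradients
        eta = eta + rho * np.real(np.vdot(grad_prev, grad))
        eta = float(np.clip(eta, 1e-5, 0.1))     # keep eta in a safe range
        w = w + eta * grad
        grad_prev = grad
    return y, w
```

A typical use is one-step identification of a nonlinear complex system: feed in the complex input `x` and desired response `d`, then inspect the squared error `|d - y|^2`, which should decay as the weights and step size adapt.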