A class of stochastic gradient algorithms with exponentiated error cost functions

  • Authors:
  • C. Boukis; D. P. Mandic; A. G. Constantinides

  • Affiliations:
  • Athens Information Technology, Autonomic and Grid Computing Group, Markopoulo Ave, Peania/Athens 19002, Greece; Imperial College, Electrical and Electronic Engineering Department, Communications and Signal Processing Group, Exhibition Rd, London SW7 2BT, UK; Imperial College, Electrical and Electronic Engineering Department, Communications and Signal Processing Group, Exhibition Rd, London SW7 2BT, UK

  • Venue:
  • Digital Signal Processing
  • Year:
  • 2009

Abstract

A novel class of stochastic gradient descent algorithms is introduced, based on the minimisation of convex cost functions with an exponential dependence on the adaptation error, instead of the conventional linear combinations of even error moments. The derivation is supported by a rigorous analysis of the necessary conditions for convergence; the steady-state mean square error is calculated, and the optimal solutions in the least exponential sense are derived. Normalisation of the associated step size is also considered, in order to fully exploit the dynamics of the input signal. Simulation results support the analysis.
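As a rough illustration of the idea described in the abstract (not the authors' exact algorithm or cost function), the sketch below implements a hypothetical LMS-type adaptive filter whose instantaneous cost is taken to be exp(lambda * e(n)^2) rather than e(n)^2. The gradient of this assumed cost scales the usual LMS correction by a factor exp(lambda * e^2), and an optional NLMS-style normalisation of the step size by the input power is included. The cost form, the parameter lambda, and the normalisation term are assumptions made purely for illustration.

```python
import numpy as np

def exp_cost_lms(x, d, num_taps=8, mu=0.05, lam=0.5, eps=1e-8, normalised=True):
    """Hypothetical LMS variant with an exponentiated squared-error cost.

    Assumed instantaneous cost: J(n) = exp(lam * e(n)**2), whose gradient
    with respect to the weights gives the update
        w <- w + mu * lam * e(n) * exp(lam * e(n)**2) * u(n),
    optionally normalised by the regressor power as in NLMS.
    """
    w = np.zeros(num_taps)
    e_hist = np.zeros(len(x))
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1 : n + 1][::-1]   # regressor [x[n], ..., x[n-M+1]]
        e = d[n] - w @ u                         # adaptation error
        g = np.exp(lam * e ** 2)                 # exponential weighting of the gradient
        step = mu / (eps + u @ u) if normalised else mu
        w += step * lam * g * e * u              # stochastic gradient-descent correction
        e_hist[n] = e
    return w, e_hist

# Usage: identify an unknown FIR system from noisy observations.
rng = np.random.default_rng(0)
h = np.array([0.7, -0.3, 0.2, 0.1, 0.0, 0.05, -0.02, 0.01])   # unknown system
x = rng.standard_normal(5000)
d = np.convolve(x, h, mode="full")[: len(x)] + 0.01 * rng.standard_normal(len(x))
w_est, e = exp_cost_lms(x, d, num_taps=8)
```

In this sketch the exponential factor grows with the magnitude of the error, so large errors produce proportionally larger weight corrections than in plain LMS; the abstract's conditions on convergence and steady-state behaviour would constrain how mu and lambda may be chosen, but those specifics are in the paper itself, not reproduced here.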