Statistical analysis of a two-layer backpropagation algorithm used for modeling nonlinear memoryless channels: the single neuron case

  • Authors:
  • N.J. Bershad; M. Ibnkahla; F. Castanie

  • Affiliations:
  • Dept. of Electr. & Comput. Eng., California Univ., Irvine, CA

  • Venue:
  • IEEE Transactions on Signal Processing
  • Year:
  • 1997

Abstract

Neural networks trained with backpropagation (BP) learning on experimental data have been used to model the characteristics of memoryless nonlinear channels. To better understand this application, this paper studies the transient and convergence properties of a simplified two-layer neural network that uses the BP algorithm and is trained with zero-mean Gaussian data. The paper examines the effects of the network structure, weights, initial conditions, and algorithm step size on the mean square error (MSE) of the neural network approximation. The performance analysis rests on the derivation of recursions for the mean weight update, which can be used to predict the weights and the MSE over time. Monte Carlo simulations show good to excellent agreement between the actual behavior and the predictions of the theoretical model.
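The setting the abstract describes can be illustrated with a minimal sketch: a "two-layer" network reduced to a single neuron (one input weight, a tanh nonlinearity, one output weight) trained by stochastic BP on zero-mean Gaussian inputs to fit a memoryless nonlinear channel, while a running MSE estimate tracks convergence. The channel function, weight names (`w1`, `w2`), step size `mu`, and iteration count below are illustrative assumptions, not values from the paper.

```python
import math
import random

random.seed(0)

# Hypothetical memoryless nonlinear channel to be modeled (the paper does
# not specify one here; a saturating tanh is a common stand-in).
def channel(x):
    return math.tanh(2.0 * x)

# Single-neuron model: output y = w2 * tanh(w1 * x).
w1, w2 = 0.1, 0.1     # illustrative initial conditions
mu = 0.05             # BP algorithm step size (assumed value)
n_iter = 20000
mse_est = None        # running (exponentially averaged) MSE estimate

for _ in range(n_iter):
    x = random.gauss(0.0, 1.0)   # zero-mean Gaussian training input
    d = channel(x)               # desired channel output
    h = math.tanh(w1 * x)        # hidden-layer output
    y = w2 * h                   # network output
    e = d - y                    # instantaneous modeling error
    # Stochastic-gradient (BP) weight updates
    w2 += mu * e * h
    w1 += mu * e * w2 * (1.0 - h * h) * x
    mse_est = e * e if mse_est is None else 0.999 * mse_est + 0.001 * e * e

print(w1, w2, mse_est)
```

In this sketch the model can represent the channel exactly (at w1 = 2, w2 = 1), so the running MSE decays toward zero; the paper's analysis predicts this transient behavior analytically via recursions for the mean weights, which the Monte Carlo runs then confirm.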