Generalized regression neural networks in time-varying environment

  • Authors:
  • L. Rutkowski

  • Affiliations:
  • Dept. of Comput. Eng., Tech. Univ. of Czestochowa, Lodz, Poland

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2004


Abstract

The current state of knowledge regarding nonstationary processes is significantly poorer than in the case of stationary signals. In many applications, signals are treated as stationary only because this makes them easier to analyze; in fact, they are nonstationary. Nonstationary processes are undoubtedly more difficult to analyze, and their diversity makes the application of universal tools impossible. In this paper we propose a new class of generalized regression neural networks working in a nonstationary environment. The generalized regression neural networks (GRNN) studied in this paper are able to follow changes of the best model, i.e., time-varying regression functions. The novelty is summarized as follows: 1) We present adaptive GRNN tracking time-varying regression functions. 2) We prove convergence of the GRNN based on general learning theorems presented in Section IV. 3) We design in detail special GRNN based on the Parzen and orthogonal series kernels. In each case we specify conditions ensuring convergence of the GRNN to the best models described by regression functions. 4) We investigate the speed of convergence of the GRNN and compare the performance of specific structures based on the Parzen kernel and the orthogonal series kernel. 5) We study various nonstationarities (multiplicative, additive, "scale change," "movable argument") and design in each case the GRNN based on the Parzen kernel and the orthogonal series kernel.
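
For readers unfamiliar with the underlying estimator, the sketch below shows the classical Parzen (Gaussian) kernel GRNN form applied over a sliding window of recent samples so that the estimate can follow a slowly drifting regression function. This is only an illustrative assumption-laden sketch, not the paper's adaptive procedure or its convergence conditions; the function name grnn_estimate, the bandwidth, the window length, and the synthetic drifting target are all hypothetical choices made here for demonstration.

```python
import numpy as np

def grnn_estimate(x_query, x_train, y_train, bandwidth=0.3):
    """Parzen (Gaussian) kernel regression estimate at x_query.

    Classical GRNN / kernel-regression form:
        y_hat(x) = sum_i y_i * K((x - x_i)/h) / sum_i K((x - x_i)/h)
    Bandwidth and windowing below are arbitrary illustrative choices,
    not the conditions derived in the paper.
    """
    u = (x_query - x_train) / bandwidth
    weights = np.exp(-0.5 * u ** 2)            # Gaussian Parzen kernel
    return np.sum(weights * y_train) / (np.sum(weights) + 1e-12)

# Track a slowly time-varying regression function with a sliding window
# (synthetic additive drift used only to generate example data).
rng = np.random.default_rng(0)
window = 200
xs, ys, preds, truths = [], [], [], []
for t in range(2000):
    x_t = rng.uniform(0.0, 1.0)
    true_t = np.sin(2 * np.pi * x_t) + 0.001 * t   # drifting target
    y_t = true_t + 0.1 * rng.standard_normal()
    if len(xs) >= window:
        y_hat = grnn_estimate(x_t, np.array(xs[-window:]), np.array(ys[-window:]))
        preds.append(y_hat)
        truths.append(true_t)
    xs.append(x_t)
    ys.append(y_t)

print("mean abs tracking error:", np.mean(np.abs(np.array(preds) - np.array(truths))))
```

Restricting the kernel average to recent observations is one simple way to cope with nonstationarity; the paper instead develops adaptive GRNN structures with formally proven convergence for the nonstationarity types listed above.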