The current state of knowledge regarding nonstationary processes is significantly poorer than in the case of stationary signals. In many applications, signals are treated as stationary only because this makes them easier to analyze; in fact, they are nonstationary. Nonstationary processes are undoubtedly more difficult to analyze, and their diversity makes the application of universal tools impossible. In this paper we propose a new class of generalized regression neural networks (GRNN) working in a nonstationary environment. The GRNN studied in this paper are able to follow changes of the best model, i.e., time-varying regression functions. The novelty is summarized as follows: 1) We present adaptive GRNN that track time-varying regression functions. 2) We prove convergence of the GRNN based on general learning theorems presented in Section IV. 3) We design in detail specific GRNN based on the Parzen and orthogonal series kernels; in each case we specify conditions ensuring convergence of the GRNN to the best models described by the regression function. 4) We investigate the speed of convergence of the GRNN and compare the performance of specific structures based on the Parzen kernel and the orthogonal series kernel. 5) We study various nonstationarities (multiplicative, additive, "scale change," "movable argument") and in each case design the GRNN based on the Parzen kernel and the orthogonal series kernel.
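The abstract does not give the estimator's recursion, but the idea of a Parzen-kernel GRNN tracking a time-varying regression can be illustrated with a minimal sketch. The code below implements a Nadaraya-Watson-style estimator in which both the numerator and denominator sums are discounted by a forgetting factor `lam`, so old samples fade and the estimate follows a drifting regression function; the class name `AdaptiveGRNN`, the Gaussian choice of Parzen kernel, the forgetting-factor mechanism, and all parameter values are illustrative assumptions, not the paper's actual construction.

```python
import numpy as np

def gaussian_kernel(u):
    # Gaussian Parzen kernel (one common choice; an assumption here)
    return np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)

class AdaptiveGRNN:
    """Recursive Nadaraya-Watson (GRNN-style) regression estimate on a
    fixed grid, with exponential forgetting (lam < 1) so the estimate
    can track a time-varying regression function. Illustrative only."""

    def __init__(self, grid, bandwidth=0.3, lam=0.99):
        self.grid = np.asarray(grid, dtype=float)
        self.h = bandwidth          # kernel bandwidth
        self.lam = lam              # forgetting factor (discounts the past)
        self.num = np.zeros_like(self.grid)   # discounted sum of K * y
        self.den = np.zeros_like(self.grid)   # discounted sum of K

    def update(self, x, y):
        # One new observation (x, y): discount old sums, add kernel weights.
        w = gaussian_kernel((self.grid - x) / self.h) / self.h
        self.num = self.lam * self.num + w * y
        self.den = self.lam * self.den + w

    def estimate(self):
        # Ratio estimate; grid points with no mass yet are set to 0.
        with np.errstate(divide="ignore", invalid="ignore"):
            f = self.num / self.den
        return np.where(self.den > 0, f, 0.0)

# Usage: track a multiplicative nonstationarity m_t(x) = a_t * sin(x),
# where the amplitude a_t drifts slowly over time.
rng = np.random.default_rng(0)
grid = np.linspace(-3.0, 3.0, 61)
model = AdaptiveGRNN(grid)
for t in range(2000):
    a_t = 1.0 + 0.0005 * t                    # slow amplitude drift
    x = rng.uniform(-3.0, 3.0)
    y = a_t * np.sin(x) + 0.1 * rng.normal()  # noisy observation
    model.update(x, y)
f_hat = model.estimate()                      # tracks the recent a_t * sin(x)
```

With `lam = 1` the sketch reduces to the classical (stationary) kernel regression estimate; `lam < 1` is one simple way to obtain the tracking behavior the abstract describes, and the paper's own schemes based on the Parzen and orthogonal series kernels would replace this heuristic recursion.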