A Modified General Regression Neural Network (MGRNN) is presented as an easy-to-use 'black box' tool: available data are fed in and a reasonable regression surface is obtained. The MGRNN is based on the General Regression Neural Network of D. Specht [Specht, D. (1991). A general regression neural network. IEEE Transactions on Neural Networks, 2(6), 568-576]; consequently, the network's architecture and weights are determined directly by the training data. The kernel width of each training sample is trained by two supervised training algorithms. These fast and reliable algorithms require four user-definable parameters, but are robust against changes in those parameters. The network's generalization ability was tested on several benchmarks: intertwined spirals, the Mackey-Glass time series, and PROBEN1. The MGRNN provides two additional features: (1) it can be trained on arbitrary data as long as a suitable metric exists; in particular, it is unnecessary to force the data into vectors of equal length; (2) it can compute the gradient of the regression surface whenever the gradient of the metric is defined. The MGRNN thus avoids common practical problems of standard feed-forward networks.
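To make the underlying GRNN mechanism concrete, the following is a minimal sketch of Specht-style prediction with per-sample kernel widths, the quantities the MGRNN's two supervised algorithms would train. This is an illustrative reconstruction, not the authors' implementation; the function name and the uniform widths in the toy usage are assumptions.

```python
import numpy as np

def grnn_predict(x, X_train, y_train, sigmas):
    """GRNN (Specht, 1991) prediction at a single query point x.

    sigmas holds one Gaussian kernel width per training sample,
    mirroring the per-sample widths trained by the MGRNN.
    """
    # Squared Euclidean distances from the query to every training sample
    d2 = np.sum((X_train - x) ** 2, axis=1)
    # Gaussian kernel activation of each pattern unit, with its own width
    w = np.exp(-d2 / (2.0 * sigmas ** 2))
    # Nadaraya-Watson weighted average of the training targets
    return float(np.dot(w, y_train) / np.sum(w))

# Toy usage: regress y = x^2 from noiseless one-dimensional samples
X = np.linspace(-1.0, 1.0, 21).reshape(-1, 1)
y = (X ** 2).ravel()
sigmas = np.full(len(X), 0.1)  # assumed uniform widths for this sketch
print(grnn_predict(np.array([0.5]), X, y, sigmas))
```

Because the regression surface is a smooth weighted average of stored samples, no iterative weight fitting is needed; only the widths are tuned. Replacing the squared Euclidean distance with any suitable metric gives the MGRNN's first additional feature.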