We apply a novel black-box approximation algorithm, IBHM, to learn both the structure and the parameters of a nonlinear regression model. IBHM builds the model incrementally as a weighted sum of activation functions, each a nonlinear function of the input vector. In every iteration the error between the current model and the approximated function is analyzed, and the candidate function with the highest correlation with the observed error is selected. This function is added to the model's set of activation functions and the process repeats; in this way IBHM determines both the model structure and the parameter values. In this paper we briefly outline the method and present results on the NN3 benchmark set, comparing them with state-of-the-art methods that share a similar model structure: a Multilayer Perceptron with a single hidden layer and Support Vector Regression.
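The incremental loop described above — score candidate activation functions by their correlation with the current residual, add the best one with a fitted weight, repeat — can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: the candidate pool, the `tanh` activations, the least-squares weight rule, and all function names are assumptions.

```python
import numpy as np

def correlation(a, b):
    """Pearson correlation between two sample vectors."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return 0.0 if denom == 0 else float(np.dot(a, b) / denom)

def ibhm_fit(X, y, candidates, n_terms=5):
    """Greedy sketch of IBHM-style construction: repeatedly add the
    candidate activation most correlated with the current residual."""
    model_terms = []      # list of (weight, activation function) pairs
    residual = y.copy()
    for _ in range(n_terms):
        # score every candidate by |correlation| with the observed error
        outputs = [g(X) for g in candidates]
        scores = [abs(correlation(out, residual)) for out in outputs]
        k = int(np.argmax(scores))
        h = outputs[k]
        # least-squares weight for the newly added term
        w = float(np.dot(h, residual) / np.dot(h, h))
        model_terms.append((w, candidates[k]))
        residual = residual - w * h
    return model_terms

def predict(model_terms, X):
    """Evaluate the weighted sum of the selected activations."""
    return sum(w * g(X) for w, g in model_terms)

# Toy usage: approximate a nonlinear 1-D target with sigmoid-like candidates.
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=200)
y = np.tanh(2 * X) + 0.5 * np.sin(3 * X)
candidates = [lambda x, a=a, b=b: np.tanh(a * x + b)
              for a in (0.5, 1.0, 2.0, 3.0) for b in (-1.0, 0.0, 1.0)]
model = ibhm_fit(X, y, candidates, n_terms=6)
err = np.mean((predict(model, X) - y) ** 2)
```

Because each weight is the one-dimensional least-squares fit of the new term to the residual, the residual norm is non-increasing from iteration to iteration, which is what makes the greedy structure search stable.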