Applied regression analysis and other multivariable methods
Neural networks: a systematic introduction
DARPA Neural Network Study
Application of four-layer neural network on information extraction
Neural Networks - 2003 Special issue: Advances in neural networks research IJCNN'03
WSEAS Transactions on Mathematics
Multi-layer perceptron (MLP) neural networks are well known as universal approximators and are often used as estimation tools in place of classical statistical methods. The focus of this study is to compare the approximation ability of the MLP with a traditional statistical regression model, namely polynomial regression. The single-hidden-layer MLP, the double-hidden-layer MLP, and polynomial regression are compared on the basis of a similar number of weights or parameters. The performance of these three model classes is measured by the fraction of variance unexplained (FVU): the closer the FVU is to zero, the better the estimate and the higher the accuracy. From the empirical results obtained in this study, we conclude that, for a similar number of parameters, polynomial regression overall performs slightly better than the MLP, except for the complicated interaction function, which the MLP approximates more appropriately. Meanwhile, the double-hidden-layer MLP outperforms the single-hidden-layer MLP.
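The FVU used above has a standard definition: the residual sum of squares divided by the total sum of squares, i.e. FVU = 1 - R². A minimal sketch in Python (the toy data below is illustrative, not from the study):

```python
# Fraction of variance unexplained (FVU), the metric used in the comparison:
# FVU = sum((y - y_hat)^2) / sum((y - y_bar)^2), equivalently 1 - R^2.
# Lower FVU means a better fit; a perfect fit gives FVU = 0.

def fvu(y_true, y_pred):
    """Return the fraction of variance unexplained (lower is better)."""
    mean_y = sum(y_true) / len(y_true)
    ss_res = sum((y - p) ** 2 for y, p in zip(y_true, y_pred))
    ss_tot = sum((y - mean_y) ** 2 for y in y_true)
    return ss_res / ss_tot

# Toy illustration: a perfect estimator attains FVU = 0,
# while a slightly noisy estimator attains a small positive FVU.
y = [1.0, 2.0, 3.0, 4.0]
print(fvu(y, [1.0, 2.0, 3.0, 4.0]))  # exact fit: FVU = 0
print(fvu(y, [1.1, 1.9, 3.2, 3.8]))  # near fit: small FVU
```

In the study, this quantity would be computed on the fitted values of each model (MLP or polynomial regression) against the target function values, so that models with similar parameter counts can be ranked directly.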