A functional approximation comparison between neural networks and polynomial regression

  • Authors:
  • Ong Hong Choon; Leong Chee Hoong; Tai Sheue Huey

  • Affiliations:
  • School of Mathematical Sciences, Universiti Sains Malaysia, Pulau Pinang, Malaysia

  • Venue:
  • WSEAS Transactions on Mathematics
  • Year:
  • 2008

Abstract

Multi-layered perceptron (MLP) neural networks are well known as universal approximators. They are often used as estimation tools in place of classical statistical methods. The focus of this study is to compare the approximation ability of the MLP with that of a traditional statistical regression model, namely polynomial regression. The comparison among the single-hidden-layer MLP, the double-hidden-layer MLP, and polynomial regression is carried out on the basis of a similar number of weights or parameters. The performance of these three categories is measured using the fraction of variance unexplained (FVU); the closer the FVU value is to zero, the more accurate the estimate. From the empirical results obtained in this study, we conclude that polynomial regression overall performs slightly better than the MLP for a similar number of parameters, except for the complicated interaction function. Meanwhile, the double-hidden-layer MLP outperforms the single-hidden-layer MLP. The MLP is more appropriate than polynomial regression for approximating the complicated interaction function.
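The FVU used as the performance measure is the residual sum of squares divided by the total sum of squares, i.e. 1 − R². Below is a minimal sketch of the comparison protocol, assuming scikit-learn estimators and an illustrative synthetic interaction function; the paper's actual test functions, network sizes, activation functions, and training settings are not given here, so everything in this example beyond the FVU formula and the matched-parameter-count idea is an assumption.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

def fvu(y_true, y_pred):
    """Fraction of variance unexplained: residual SS over total SS (= 1 - R^2)."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return ss_res / ss_tot

# Illustrative 2-D target with an interaction term (not one of the
# paper's test functions).
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(500, 2))
y = np.sin(4.0 * X[:, 0] * X[:, 1]) + 0.05 * rng.standard_normal(500)

# Polynomial regression: degree 3 in 2 inputs -> 10 coefficients.
poly = make_pipeline(PolynomialFeatures(degree=3), LinearRegression())
poly.fit(X, y)

# Single-hidden-layer MLP sized to a comparable weight count:
# 2 hidden units give 2*2 + 2 + 2 + 1 = 9 weights and biases.
mlp = MLPRegressor(hidden_layer_sizes=(2,), max_iter=5000, random_state=0)
mlp.fit(X, y)

print("FVU, polynomial regression:", fvu(y, poly.predict(X)))
print("FVU, single-hidden-layer MLP:", fvu(y, mlp.predict(X)))
```

A double-hidden-layer variant would use, e.g., `hidden_layer_sizes=(2, 2)` while adjusting the layer widths so the total weight count stays close to the polynomial's parameter count, matching the paper's like-for-like comparison.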