Approximating Unknown Mappings: An Experimental Evaluation

  • Authors:
  • Rafael Martí; Francisco Montes; Abdellah El-Fallahi

  • Affiliations:
  • Departamento de Estadística e Investigación Operativa, Universitat de València, 46100 Burjassot (Valencia), Spain

  • Venue:
  • Journal of Heuristics
  • Year:
  • 2005

Abstract

Different methodologies have been introduced in recent years with the aim of approximating unknown functions. Essentially, these methodologies are general frameworks for representing non-linear mappings from several input variables to several output variables. Research on this problem is carried out in applied mathematics (multivariate function approximation), statistics (nonparametric multiple regression) and computer science (neural networks). However, since these methodologies were proposed in different fields, most previous papers treat each in isolation, ignoring the contributions of the other areas. In this paper we consider five well-known approaches to function approximation: polynomial approximation, generalized additive models (GAM), local regression (LOESS), multivariate adaptive regression splines (MARS) and artificial neural networks (ANN).

Neural networks can be viewed as models of real systems, built by tuning parameters known as weights. In training the net, the problem is to find the weights that optimize its performance, i.e. that minimize the error over the training set. Although the most popular method for ANN training is backpropagation, other optimization methods based on metaheuristics have recently been adapted to this problem, outperforming classical approaches. In this paper we propose a short-term memory tabu search method, coupled with path relinking and BFGS (a gradient-based local NLP solver), to provide high-quality solutions to this problem. Experiments with 15 previously reported functions show that a feed-forward neural network with one hidden layer, trained with our procedure, can compete with the best-known approximating methods. The experimental results also show the effectiveness of a new mechanism to avoid overfitting in neural network training.
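
To make the training formulation concrete, the following is a minimal sketch (in Python with NumPy and SciPy, which the paper does not specify) of the model class and objective described above: a feed-forward network with one hidden layer whose flat weight vector is fit by minimizing the mean squared error over the training set with BFGS. Only the gradient-based local phase is shown; the paper's tabu search and path relinking components, and its overfitting-avoidance mechanism, are not reproduced here. All function names and the toy target are illustrative assumptions.

  # A minimal sketch, NOT the paper's tabu search / path relinking procedure:
  # only a BFGS local phase applied to the model and objective from the abstract.
  import numpy as np
  from scipy.optimize import minimize

  def unpack(w, n_in, n_hidden):
      # Split the flat weight vector into layer weights and biases.
      k = n_in * n_hidden
      W1 = w[:k].reshape(n_in, n_hidden)
      b1 = w[k:k + n_hidden]
      W2 = w[k + n_hidden:k + 2 * n_hidden]
      b2 = w[-1]
      return W1, b1, W2, b2

  def predict(w, X, n_hidden):
      # Feed-forward net: one tanh hidden layer, linear output unit.
      W1, b1, W2, b2 = unpack(w, X.shape[1], n_hidden)
      return np.tanh(X @ W1 + b1) @ W2 + b2

  def training_error(w, X, y, n_hidden):
      # Objective: mean squared error over the training set.
      return np.mean((predict(w, X, n_hidden) - y) ** 2)

  # Toy target (hypothetical, not one of the paper's 15 test functions).
  rng = np.random.default_rng(0)
  X = rng.uniform(-2.0, 2.0, size=(200, 2))
  y = np.sin(X[:, 0]) + X[:, 1] ** 2

  n_hidden = 10
  n_params = X.shape[1] * n_hidden + 2 * n_hidden + 1
  # In the hybrid method, tabu search / path relinking would supply these
  # starting points; here a single random start stands in for that phase.
  w0 = rng.normal(scale=0.5, size=n_params)
  res = minimize(training_error, w0, args=(X, y, n_hidden), method="BFGS")
  print("training MSE:", res.fun)

In practice one would also hold out a validation set to detect overfitting; the paper addresses this with its own mechanism, which is not modelled in this sketch.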