Fast training of neural trees by adaptive splitting based on cubature

  • Authors:
  • B. Llanas;F. J. Sáinz

  • Affiliation (both authors):
  • Departamento de Matemática Aplicada, ETSI de Caminos, Universidad Politécnica de Madrid, C/Profesor Aranguren s/n, Ciudad Universitaria, 28040 Madrid, Spain

  • Venue:
  • Neurocomputing
  • Year:
  • 2008

Abstract

In this paper we prove that any affine function defined on a d-simplex in R^d can be uniformly approximated by a single-hidden-layer neural network with only two neurons, irrespective of d. The weights of this network are obtained in closed analytical form, without training. This fact yields a correspondence rule that transforms mathematical approximants based on piecewise affine functions into neural networks. We introduce such an approximant, adaptive splitting based on cubature (ASBC), for the efficient approximation of continuous functions. Combining ASBC with the above correspondence rule produces a neural tree. Numerical experiments on learning the distance function from a variable point to a geometric body, in two and three dimensions, show fast learning and high accuracy compared with single-hidden-layer feedforward networks trained by a trust-region method based on the interior-reflective Newton algorithm.
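The two-neuron approximation of an affine map can be illustrated numerically. The sketch below is not the authors' closed-form construction (which is given in the paper); it is a standard Taylor-expansion argument: since sigmoid(t) - sigmoid(-t) ≈ t/2 near the origin, the affine function u(x) = c·x + d is recovered, up to an O(eps^2) error, from two sigmoid neurons sharing the input weights eps·c and biases ±eps·d. All symbols (`c`, `d0`, `eps`) are illustrative choices, not values from the paper.

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def two_neuron_affine(c, d, eps=1e-3):
    """Approximate u(x) = c.x + d with two sigmoid neurons.

    Since sigmoid(z) - sigmoid(-z) = z/2 - z^3/24 + ... near 0,
    scaling the inputs by eps and the output by 2/eps recovers u
    with uniform error O(eps^2) on any bounded set.
    """
    a = 2.0 / eps  # output weight: 1 / (2 * sigmoid'(0) * eps), sigmoid'(0) = 1/4
    def net(x):
        z = eps * (x @ c + d)           # shared pre-activation for both neurons
        return a * (sigmoid(z) - sigmoid(-z))
    return net

# Sample points in a 3-simplex via a Dirichlet distribution and check the
# worst-case error of the two-neuron network against the exact affine map.
rng = np.random.default_rng(0)
d_dim = 3
c = rng.normal(size=d_dim)              # illustrative affine coefficients
d0 = 0.5
x = rng.dirichlet(np.ones(d_dim), size=1000)  # points in the standard simplex
net = two_neuron_affine(c, d0)
err = np.max(np.abs(net(x) - (x @ c + d0)))
print(err)
```

Note that the network size (two neurons) and the error bound do not depend on the ambient dimension d, which is the point of the result: only the input weights grow with d, not the hidden layer.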