In this paper we prove that any affine function defined on a d-simplex in R^d can be uniformly approximated by a single-layer neural network with only two neurons, irrespective of d. The weights of this network are obtained in closed analytical form, without training. This fact yields a correspondence rule that allows one to transform mathematical approximants based on piecewise affine functions into neural networks. We introduce such an approximant, adaptive splitting based on cubature (ASBC), for the efficient approximation of continuous functions. Using ASBC and the above correspondence rule, we obtain a neural tree. Numerical experiments on learning the distance function from a variable point to a geometric body, in two and three dimensions, show fast learning speed and high accuracy compared with single-hidden-layer feedforward networks trained by a trust-region method based on the interior-reflective Newton algorithm.
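The two-neuron idea can be illustrated with a standard construction (a sketch of the general principle, not necessarily the paper's exact closed form): since sigma(t) - sigma(-t) = tanh(t/2) ~ t/2 for small t, the affine map u = a·x + b is recovered, up to an O(lambda^2) uniform error on a bounded simplex, by two sigmoid neurons with closed-form weights and no training. The function name `two_neuron_affine` and the choice lambda = 1e-3 are illustrative assumptions.

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def two_neuron_affine(x, a, b, lam=1e-3):
    """Approximate the affine map u = a.x + b with two sigmoid neurons.

    Uses sigma(t) - sigma(-t) = tanh(t/2) ~ t/2 for small t, so
    (2/lam) * [sigma(lam*u) - sigma(-lam*u)] ~ u with error O(lam^2).
    All weights are closed-form; no training is needed. (Illustrative
    construction, not necessarily the paper's exact formula.)
    """
    u = x @ a + b
    return (2.0 / lam) * (sigmoid(lam * u) - sigmoid(-lam * u))

# Example: sample points in the standard 3-simplex and measure the
# uniform error of the two-neuron approximation over the sample.
rng = np.random.default_rng(0)
pts = rng.dirichlet(np.ones(3), size=1000)      # points in the simplex
a, b = np.array([1.5, -2.0, 0.5]), 0.3
err = np.max(np.abs(two_neuron_affine(pts, a, b) - (pts @ a + b)))
print(err)
```

Because tanh(t) = t - t^3/3 + ..., the pointwise error is about lam^2 * |u|^3 / 12, so shrinking lambda drives the approximation uniformly to the affine function on any bounded domain, with the dimension d playing no role.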