This paper presents a family of layered feed-forward networks that can uniformly approximate functions on any metric space, as well as on a wide variety of non-metric spaces. Non-Euclidean input spaces arise frequently in practice, yet standard approximation schemes are guaranteed to work only on Euclidean spaces. Theoretical foundations are provided, together with practical algorithms and illustrative examples. This tool potentially constitutes a significant extension of the common notion of 'universal approximation capability'.
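To make the idea concrete, here is a minimal sketch of distance-based approximation on a non-Euclidean input space: strings equipped with edit distance. A hidden layer of Gaussian units is driven by the metric distance to a set of anchor points rather than by dot products with coordinate vectors. The anchor strings, the kernel width, and the Gaussian activation are illustrative assumptions, not the paper's actual construction.

```python
import math

def edit_distance(a, b):
    # Classic Wagner-Fischer dynamic program: a metric on the
    # (non-Euclidean) space of strings.
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,      # deletion
                          d[i][j - 1] + 1,      # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[m][n]

def hidden_layer(x, anchors, width=1.0):
    # One Gaussian unit per anchor point; the unit's input is the
    # metric distance from x to that anchor, so no coordinates or
    # inner products on the input space are ever needed.
    return [math.exp(-(edit_distance(x, a) / width) ** 2) for a in anchors]

def network_output(x, anchors, weights, bias=0.0, width=1.0):
    # Linear readout over the distance-based hidden layer; the weights
    # would normally be fitted to samples of the target function.
    feats = hidden_layer(x, anchors, width)
    return bias + sum(w * h for w, h in zip(weights, feats))
```

Because only pairwise distances enter the computation, the same scheme applies unchanged to any metric on the inputs; fitting the output weights then reduces to ordinary linear regression on the hidden-layer activations.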