Neural networks for functional approximation and system identification
Neural Computation
Approximation of Myopic Systems Whose Inputs Need Not Be Continuous
Multidimensional Systems and Signal Processing
Separation Conditions, Myopic Maps, and Criteria for Uniform Approximation of Input-Output Maps
Multidimensional Systems and Signal Processing
Universal approximation of multiple nonlinear operators by neural networks
Neural Computation
Use of built-in features in the interpretation of high-dimensional cancer diagnosis data
APBC '04 Proceedings of the second conference on Asia-Pacific bioinformatics - Volume 29
Neural Networks - 2003 Special issue: Neural network analysis of complex scientific data: Astronomy and geosciences
Theoretical Properties of Projection Based Multilayer Perceptrons with Functional Inputs
Neural Processing Letters
Design of an adaptive neural sliding-mode controller for seesaw systems
International Journal of Computer Applications in Technology
The essential approximation order for neural networks with trigonometric hidden layer units
ISNN'06 Proceedings of the Third international conference on Advances in Neural Networks - Volume Part I
A novel learning algorithm for feedforward neural networks
ISNN'06 Proceedings of the Third international conference on Advances in Neural Networks - Volume Part I
Supervised and unsupervised co-training of adaptive activation functions in neural nets
PSL'11 Proceedings of the First IAPR TC3 conference on Partially Supervised Learning
Fast image classification algorithms based on random weights networks
ISNN'13 Proceedings of the 10th international conference on Advances in Neural Networks - Volume Part I
Bayesian ARTMAP for regression
Neural Networks
Sparse algorithms of Random Weight Networks and applications
Expert Systems with Applications: An International Journal
Pattern Recognition Letters
The purpose of this paper is to investigate the approximation capability of neural networks systematically. The main results are: 1) every Tauber-Wiener function qualifies as an activation function for the hidden layer of a three-layered neural network; 2) a continuous function in S'(R¹) is a Tauber-Wiener function if and only if it is not a polynomial; 3) such networks can approximate nonlinear continuous functionals defined on a compact set of a Banach space, as well as nonlinear operators; and 4) neural computation can approximate the output of a dynamical system as a whole (not merely at a fixed point), thereby identifying the system.
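The first two results above can be illustrated with a minimal numerical sketch (not taken from the paper): a three-layered network of the form Σᵢ cᵢ·g(wᵢx + bᵢ), where g is a non-polynomial continuous activation (here tanh, a Tauber-Wiener function), approximating a continuous function on a compact interval. For simplicity the hidden weights and biases are fixed at random and only the output weights are fit by least squares; the function names and parameters below are illustrative choices, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_three_layer(f, n_hidden=50, n_samples=200):
    """Fit sum_i c_i * tanh(w_i * x + b_i) to f on [-pi, pi]."""
    x = np.linspace(-np.pi, np.pi, n_samples)
    w = rng.uniform(-2.0, 2.0, n_hidden)       # hidden-layer weights (fixed)
    b = rng.uniform(-np.pi, np.pi, n_hidden)   # hidden-layer biases (fixed)
    H = np.tanh(np.outer(x, w) + b)            # hidden-layer activations
    c, *_ = np.linalg.lstsq(H, f(x), rcond=None)  # output weights by least squares
    return x, H @ c

x, approx = fit_three_layer(np.sin)
err = np.max(np.abs(approx - np.sin(x)))
print(err)  # uniform error on the sample grid
```

With enough hidden units the uniform error can be driven arbitrarily small on the compact set, consistent with result 2: tanh works precisely because it is not a polynomial (a polynomial hidden layer would span only a fixed finite-dimensional function space).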