Recurrent neural networks are universal approximators
ICANN'06 Proceedings of the 16th international conference on Artificial Neural Networks - Volume Part I
The paper presents a construction theorem for a class of operators dense in the set of causal, time-invariant, fading-memory operators. In this sense, it extends the classical result of S. Boyd and L. O. Chua that Volterra series operators are universal approximators for this set of nonlinear operators, which arises frequently in the theory of dynamical systems. The new representation rests on the remarkable property of the neural-network ΣΠ functions: they form a dense algebra in the set of continuous functions on compacta in R^n. Moreover, this class of functions is known to approximate non-analytic nonlinearities effectively and, as a consequence, to avoid the higher-order terms that would otherwise appear in a polynomial decomposition. With a proper choice of the ΣΠ basis functions, this property is expected to transfer to the nonlinear-operator representation. Following this reasoning, we prove that the Volterra series is included in this richer set of nonlinear operators.
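The dense-algebra property of ΣΠ functions mentioned in the abstract can be illustrated with a minimal sketch: a Sigma-Pi unit forms a product (Π) of input components raised to chosen integer degrees, and the network output is a weighted sum (Σ) of such units. The function names, the choice of monomial units, and the least-squares fitting step below are illustrative assumptions, not constructions from the paper.

```python
import numpy as np

def sigma_pi(x, exponents, weights):
    """Evaluate a ΣΠ network (hypothetical minimal form).

    x: (n_samples, n_inputs) inputs
    exponents: (n_units, n_inputs) integer degrees, one row per Π unit
    weights: (n_units,) Σ-layer coefficients
    Returns: (n_samples,) network outputs.
    """
    # Each unit computes the product prod_j x_j ** e_kj; the output
    # layer sums the units with the given weights.
    products = np.prod(x[:, None, :] ** exponents[None, :, :], axis=2)
    return products @ weights

# Fit the Σ weights by least squares to approximate a smooth target
# f(x1, x2) = sin(x1) * x2 on a compact set [-1, 1]^2, hinting at the
# density of the ΣΠ family among continuous functions on compacta.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=(400, 2))
target = np.sin(x[:, 0]) * x[:, 1]

# Use all monomials x1^a * x2^b with a, b <= 3 as the Π units.
exponents = np.array([[a, b] for a in range(4) for b in range(4)])
features = np.prod(x[:, None, :] ** exponents[None, :, :], axis=2)
weights, *_ = np.linalg.lstsq(features, target, rcond=None)

approx = sigma_pi(x, exponents, weights)
max_err = np.max(np.abs(approx - target))
```

With only 16 product units the fit is already tight on this compact domain; enlarging the degree range shrinks the error further, which is the finite-sample face of the density argument.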