This paper is a response to difficulties reported in applying feedforward neural networks (NNs) to seasonal data. The solution we propose is a modified network model that is pruned and optimised by means of Differential Evolution. The problem NNs face with seasonality lies in the so-called 'universal approximation' property, which underpins the use of MLP networks as a vehicle for flexible non-linear regression. Our view is that seasonality is best modelled with sinusoids, which permit the use of more powerful analytical tools without any loss of generality compared with dummy variables. However, the theorems that actually support NN approximation relate specifically to functions possessing suitable smoothness properties, so it is not surprising that NNs have difficulty with seasonality; only a very 'short' sinusoid would be smooth enough. Our suggested solution is to transform the input variable so that, instead of the time variable alone, sinusoids serve as inputs. In theoretical terms this helps restore the approximation property, as can be seen in our examples, which also serve to illustrate the strength of Differential Evolution methods.
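The input transformation described above can be sketched as follows. This is a minimal illustration, not the paper's exact preprocessing: the function name, the use of sin/cos pairs, and the number of harmonics are all assumptions made here for concreteness. The idea is simply that each raw time index is replaced by sinusoidal features at the seasonal frequency before being fed to the network.

```python
import numpy as np

def seasonal_inputs(t, period, harmonics=2):
    """Map a raw time index t to sinusoidal input features.

    Instead of feeding the network the time variable alone, each
    observation is represented by sin/cos pairs at the seasonal
    frequency and its harmonics. (The helper name and the default
    of two harmonics are illustrative assumptions.)
    """
    t = np.asarray(t, dtype=float)
    feats = []
    for k in range(1, harmonics + 1):
        feats.append(np.sin(2 * np.pi * k * t / period))
        feats.append(np.cos(2 * np.pi * k * t / period))
    return np.column_stack(feats)

# Quarterly data: period 4, so t = 0..7 covers two seasonal cycles.
X = seasonal_inputs(np.arange(8), period=4)
print(X.shape)  # (8, 4): two sin/cos pairs per observation
```

Because the features repeat exactly every `period` steps, observations one season apart receive identical inputs, which is what lets a smooth network reproduce the seasonal pattern.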