From Takens' theorem, we know that the autonomous part of a dynamical system can be characterized by a sequence of observations. To force a neural network to learn these dynamics, we can use a sequence of time horizons as outputs. In standard network structures, however, these different outputs are learned nearly independently. We propose a new network architecture that allows additional information flow between the outputs and, as a consequence, yields a better representation of the underlying dynamical system. The network is based on a multilayer feedforward architecture. The model can then be converted to an equivalent nonlinear state space representation. Using this state space form, a Coupled Neural Net algorithm based on the Extended Kalman Filter is derived to estimate the state. We analyze the network on a chaotic time series (the logistic map) using the dynamical invariant that characterizes the attractor: the largest Lyapunov exponent. A detailed step-by-step description of the methodology is presented to facilitate the use of this new method. The pertinence of the model is discussed using data from the Tunisian Stock Exchange.
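As a point of reference for the logistic-map experiment mentioned above, the largest Lyapunov exponent of the fully chaotic logistic map x_{n+1} = r·x_n·(1 − x_n) with r = 4 is known analytically to be ln 2 ≈ 0.693. The sketch below (not the paper's neural-net estimator, just a direct numerical baseline; the function name and parameter choices are ours) estimates it by averaging ln|f'(x_n)| = ln|r(1 − 2x_n)| along a trajectory:

```python
import numpy as np

def lyapunov_logistic(r=4.0, x0=0.3, n_transient=1000, n_iter=200_000):
    """Estimate the largest Lyapunov exponent of the logistic map
    x_{n+1} = r*x*(1-x) by averaging log|f'(x)| = log|r*(1-2x)|
    along a trajectory, after discarding a transient."""
    x = x0
    # Discard transient iterations so the orbit settles onto the attractor.
    for _ in range(n_transient):
        x = r * x * (1.0 - x)
    # Accumulate the log of the local stretching rate |f'(x)|.
    acc = 0.0
    for _ in range(n_iter):
        acc += np.log(abs(r * (1.0 - 2.0 * x)))
        x = r * x * (1.0 - x)
    return acc / n_iter

if __name__ == "__main__":
    lam = lyapunov_logistic()
    print(f"estimated largest Lyapunov exponent: {lam:.4f} (theory: ln 2 = {np.log(2):.4f})")
```

A positive estimate confirms sensitive dependence on initial conditions, which is why the exponent serves as the invariant against which the fitted network's dynamics can be checked.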