Time delay learning by gradient descent in recurrent neural networks
ICANN '05 Proceedings of the 15th International Conference on Artificial Neural Networks: Formal Models and Their Applications - Volume Part II
When long-term dependencies are present in a time series, the approximation capabilities of recurrent neural networks are difficult to exploit with gradient descent algorithms. Such algorithms find good solutions more easily if the recurrent network includes connections with time delays. The heuristic presented here selects the locations and delays of these connections. As shown on two benchmark problems, this heuristic produces very good results while keeping the total number of connections in the recurrent network to a minimum.
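
To make the idea concrete, below is a minimal sketch (not the authors' implementation) of a recurrent network augmented with one time-delayed connection, so that the hidden state at step t receives contributions from both h[t-1] and h[t-d]. The Elman-style tanh cell, the delay value d, and the weight initialization are illustrative assumptions, not the output of the paper's heuristic.

import numpy as np

rng = np.random.default_rng(0)

n_in, n_hid, d = 1, 8, 5          # d: assumed delay of the extra connection (illustrative)
W_in  = rng.normal(0, 0.1, (n_hid, n_in))   # input -> hidden
W_rec = rng.normal(0, 0.1, (n_hid, n_hid))  # standard delay-1 recurrence
W_del = rng.normal(0, 0.1, (n_hid, n_hid))  # extra delay-d recurrent connection
W_out = rng.normal(0, 0.1, (1, n_hid))      # hidden -> output

def forward(x):
    """Run the delayed recurrent net over a 1-D input sequence x."""
    T = len(x)
    h = np.zeros((T + 1, n_hid))            # h[0] is the initial state
    y = np.zeros(T)
    for t in range(1, T + 1):
        # The delayed state shortens the gradient path across d time steps.
        delayed = h[t - d] if t - d >= 0 else np.zeros(n_hid)
        h[t] = np.tanh(W_in @ x[t - 1:t] + W_rec @ h[t - 1] + W_del @ delayed)
        y[t - 1] = (W_out @ h[t])[0]
    return y

print(forward(np.sin(np.linspace(0, 6, 30)))[:5])

Because the delay-d connection propagates information (and, during training, error gradients) across d steps in a single hop, gradient descent can capture a dependency of span d without pushing gradients through d successive nonlinearities, which is the motivation for adding such connections sparingly rather than enlarging the fully recurrent weight matrix.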