The nature of statistical learning theory.
Efficient reinforcement learning through symbiotic evolution. Machine Learning (special issue on reinforcement learning).
Solving Non-Markovian Control Tasks with Neuro-Evolution. IJCAI '99: Proceedings of the Sixteenth International Joint Conference on Artificial Intelligence.
Neural Computation.
A learning algorithm for continually running fully recurrent neural networks. Neural Computation.
LSTM recurrent networks learn simple context-free and context-sensitive languages. IEEE Transactions on Neural Networks.
Modeling systems with internal state using Evolino. GECCO '05: Proceedings of the 7th Annual Conference on Genetic and Evolutionary Computation.
Training Recurrent Networks by Evolino. Neural Computation.
Recurrent neural-genetic hybrids in corporate financial evaluation. SMO '06: Proceedings of the 6th WSEAS International Conference on Simulation, Modelling and Optimization.
The neuronal replicator hypothesis. Neural Computation.
Clifford support vector machines for classification, regression, and recurrence. IEEE Transactions on Neural Networks.
ICS '10: Proceedings of the 14th WSEAS International Conference on Systems (part of the 14th WSEAS CSCC Multiconference), Volume I.
Breast cancer detection using Cartesian genetic programming evolved artificial neural networks. Proceedings of the 14th Annual Conference on Genetic and Evolutionary Computation.
Current neural network learning algorithms are limited in their ability to model non-linear dynamical systems. Most supervised gradient-based recurrent neural networks (RNNs) suffer from a vanishing error signal that prevents learning from inputs far in the past; those that do not still struggle when there are numerous local minima. We introduce a general framework for sequence learning, EVOlution of recurrent systems with LINear Outputs (Evolino). Evolino uses evolution to discover good RNN hidden-node weights, while methods such as linear regression or quadratic programming compute optimal linear mappings from hidden state to output. Using the Long Short-Term Memory (LSTM) RNN architecture, the method is tested in three very different problem domains: (1) context-sensitive languages, (2) multiple superimposed sine waves, and (3) the Mackey-Glass system. Evolino performs exceptionally well across all tasks, whereas other methods show notable deficiencies on some.
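The core idea, evolving only the recurrent hidden weights while solving for the linear readout analytically, can be sketched in a few lines. The sketch below is a minimal illustration, not the authors' implementation: it substitutes a plain tanh RNN for LSTM, a simple (mu + lambda) evolution strategy for their evolutionary method, and a toy next-step sine-prediction task; all names and parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task (assumption, for illustration): one-step-ahead prediction of a sine wave.
T = 200
t = np.arange(T)
x = np.sin(0.2 * t)        # input sequence
y = np.sin(0.2 * (t + 1))  # target: the next sample

N_HIDDEN = 10

def run_rnn(w_in, w_rec, inputs):
    """Run a plain tanh RNN (stand-in for LSTM) and return the T x N hidden-state matrix."""
    h = np.zeros(N_HIDDEN)
    states = []
    for u in inputs:
        h = np.tanh(w_in * u + w_rec @ h)
        states.append(h)
    return np.array(states)

def fitness(genome):
    """Evolino-style evaluation: given evolved hidden weights, compute the
    optimal linear readout by least squares, then score the residual error."""
    w_in = genome[:N_HIDDEN]
    w_rec = genome[N_HIDDEN:].reshape(N_HIDDEN, N_HIDDEN)
    H = run_rnn(w_in, w_rec, x)
    w_out, *_ = np.linalg.lstsq(H, y, rcond=None)  # linear regression step
    return np.mean((H @ w_out - y) ** 2)

# Simple (mu + lambda) evolution strategy over the hidden-layer weights only;
# the output weights are never evolved, always solved for.
G = N_HIDDEN + N_HIDDEN * N_HIDDEN
pop = [0.5 * rng.standard_normal(G) for _ in range(20)]
for gen in range(30):
    parents = sorted(pop, key=fitness)[:5]
    pop = parents + [p + 0.1 * rng.standard_normal(G)
                     for p in parents for _ in range(3)]

best = min(pop, key=fitness)
print(f"best MSE: {fitness(best):.6f}")
```

The division of labor is the point: evolution only has to find hidden dynamics whose states span the target, since for any candidate the readout error is computed at its exact optimum rather than left to gradient descent.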