Incremental evolution of complex general behavior
Adaptive Behavior - Special issue on environment structure and behavior
Recurrent neural networks can learn to implement symbol-sensitive counting
Advances in Neural Information Processing Systems 10 (NIPS 1997)
Evolving neural networks through augmenting topologies
Evolutionary Computation
Genetic Synthesis of Modular Neural Networks
Proceedings of the 5th International Conference on Genetic Algorithms
Learning precise timing with LSTM recurrent networks
The Journal of Machine Learning Research
Learning to Forget: Continual Prediction with LSTM
Neural Computation
Training Recurrent Networks by Evolino
Neural Computation
Empirical Studies in Action Selection with Reinforcement Learning
Adaptive Behavior - Animals, Animats, Software Agents, Robots, Adaptive Systems
Solving deep memory POMDPs with recurrent policy gradients
Proceedings of the 17th International Conference on Artificial Neural Networks (ICANN 2007)
A fast and elitist multiobjective genetic algorithm: NSGA-II
IEEE Transactions on Evolutionary Computation
LSTM recurrent networks learn simple context-free and context-sensitive languages
IEEE Transactions on Neural Networks
Long Short-Term Memory (LSTM) is one of the best recent supervised sequence learning methods. It uses gradient descent to train memory cells represented as differentiable computational graph structures. The particular structure of the LSTM cell, however, appears somewhat arbitrary. In this paper we optimize the cell's computational structure with a multi-objective evolutionary algorithm, using a fitness function that reflects how useful a candidate structure is for learning various formal languages. The evolved cells help identify the features that are crucial for sequence learning.
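For context, the standard LSTM memory cell that serves as the starting point for such structural optimization can be sketched as follows. This is a minimal, illustrative NumPy implementation of one forward step of a conventional LSTM cell (with input, forget, and output gates); the function and variable names are our own and not taken from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell_step(x, h_prev, c_prev, W, b):
    """One forward step of a standard LSTM memory cell.

    x:      input vector of shape (D,)
    h_prev: previous hidden state, shape (H,)
    c_prev: previous cell state, shape (H,)
    W:      weight matrix of shape (4*H, D + H)
    b:      bias vector of shape (4*H,)

    The four row blocks of W correspond to the input gate,
    forget gate, candidate update, and output gate.
    """
    H = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    i = sigmoid(z[0:H])            # input gate
    f = sigmoid(z[H:2 * H])        # forget gate
    g = np.tanh(z[2 * H:3 * H])    # candidate cell update
    o = sigmoid(z[3 * H:4 * H])    # output gate
    c = f * c_prev + i * g         # gated memory update
    h = o * np.tanh(c)             # gated hidden output
    return h, c
```

An evolutionary search over cell structures would vary the wiring of this graph (which gates exist, which nonlinearities are applied, how the cell state is updated) while keeping each candidate differentiable so it remains trainable by gradient descent.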