This study shows that a mixture-of-RNN-experts model can acquire the ability to generate sequences that are combinations of multiple primitive patterns by means of self-organizing chaos. Through training, each expert learns a primitive sequence pattern, and a gating network learns to imitate stochastic switching among the multiple primitives via chaotic dynamics, exploiting sensitive dependence on initial conditions. As a demonstration, we present a numerical simulation in which the model learns Markov-chain switching among several Lissajous curves through chaotic dynamics. Our analysis shows that, given an amount of training data balanced against the network's memory capacity, the conditions for embedding the target stochastic sequences into a chaotic dynamical system can be satisfied. We also show that the reconstruction of a stochastic time series by a chaotic model can be stabilized by adding a negligible amount of noise to the dynamics of the model.
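To make the training target concrete, the following is a minimal sketch (not the authors' code) of the kind of data described above: a sequence produced by Markov-chain switching among a few Lissajous curves. The specific frequency ratios, phases, transition matrix, and segment lengths are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Three primitive Lissajous patterns, each defined by frequency ratios
# (a, b) and a phase offset delta: (x, y) = (sin(a*t + delta), sin(b*t)).
# These particular parameters are assumptions, not taken from the paper.
primitives = [(1, 2, 0.0), (3, 2, np.pi / 4), (1, 3, np.pi / 2)]

# Markov transition matrix over the three primitives (each row sums to 1).
P = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])

def generate(n_segments=10, steps_per_cycle=50):
    """Concatenate one cycle of each primitive sampled by the Markov chain."""
    state = 0
    t = np.linspace(0.0, 2 * np.pi, steps_per_cycle, endpoint=False)
    segments = []
    for _ in range(n_segments):
        a, b, delta = primitives[state]
        seg = np.stack([np.sin(a * t + delta), np.sin(b * t)], axis=1)
        segments.append(seg)
        # Stochastic switching to the next primitive.
        state = rng.choice(3, p=P[state])
    return np.concatenate(segments, axis=0)

seq = generate()
print(seq.shape)  # (500, 2): 10 segments of 50 two-dimensional steps
```

In the model described by the abstract, the gating network would have to reproduce this stochastic switching deterministically, which is where chaotic dynamics and their sensitivity to initial conditions come in.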