Some researchers in the computational sciences have treated music computation, including music reproduction and generation, as a dynamical system, i.e., a feedback process in which the state of the musical system depends on a history of past states. Recurrent neural networks have been deployed as models for learning such musical processes. We first present a tutorial discussion of recurrent networks, covering those that have been used for music learning. We then trace a thread of development in which increasingly intricate music has been learned as the state of the art in recurrent networks has improved. Finally, we present findings showing that a long short-term memory (LSTM) recurrent network, combined with new representations that encode musical knowledge, can learn musical tasks and reproduce long songs; given a reharmonization of the chordal structure, it can then generate an improvisation.
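The history dependence described above can be sketched with a minimal recurrent update. The following is an illustrative Elman-style recurrence in numpy, not the LSTM architecture or representations studied in the paper; all names, sizes, and weights are hypothetical, chosen only to show that the hidden state mixes the current input with the previous state.

```python
import numpy as np

# Illustrative Elman-style recurrent step (NOT the paper's LSTM model):
# the state h_t depends on the current input x_t AND the previous state
# h_{t-1}, so the network's output reflects a history of past states --
# the "feedback process" described in the abstract.

rng = np.random.default_rng(0)

n_in, n_hidden = 4, 8  # hypothetical sizes, e.g. a tiny one-hot pitch alphabet
W_xh = rng.normal(scale=0.5, size=(n_hidden, n_in))      # input -> hidden
W_hh = rng.normal(scale=0.5, size=(n_hidden, n_hidden))  # hidden -> hidden (feedback)

def step(h_prev, x):
    """One recurrent update: the new state mixes input with the previous state."""
    return np.tanh(W_xh @ x + W_hh @ h_prev)

# Feed a short "melody" of one-hot note vectors through the network.
notes = np.eye(n_in)
h = np.zeros(n_hidden)
for x in (notes[0], notes[1], notes[2]):
    h = step(h, x)

# The same final input produces a different state when the preceding notes
# are absent -- exactly the history dependence feedforward networks lack.
h_no_history = step(np.zeros(n_hidden), notes[2])
print(np.allclose(h, h_no_history))  # False: the state carries history
```

A plain recurrence like this suffers from vanishing gradients over long sequences, which is why the paper's work relies on LSTM, whose gated memory cells preserve information over the much longer timescales needed to reproduce entire songs.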