From the Book: The idea for this book emerged as the two of us were organizing the 1996 Neural Information Processing Systems (NIPS) workshop on recurrent networks. Many of the telephone calls between us turned to students and research. We both bemoaned the fact that we were handing large stacks of journal papers to our potential graduate students, in some cases scaring them off to other advisors. Since we were both dealing with terminal master's students who, if we were lucky, would stick around for a whole year, we needed introductory material that was self-contained, accessible, and devoid of distractions: a book that a first-year graduate student, or upper-division undergraduate, could read and become productive with in a short period of time. Unfortunately, no such book was available at the time. Hertz, Krogh, and Palmer's textbook serves well in getting students up to speed on back-propagation, Hopfield networks, and Boltzmann machines, but it did little justice to our research area, dynamical recurrent networks. Sometime before the workshop, we decided to fill this void.

From the conception of this project, we both agreed that this book was not going to be yet another edited collection of papers. We wanted a coherent presentation of the field. One difficulty for students has been the arbitrary nomenclature and mathematical notation used to describe recurrent networks. Though not insurmountable, it acts as a speed bump, reducing the valuable time that students could be spending elsewhere. Like any other young field, ours has several researchers with competing viewpoints. One problem with edited volumes is that the author of a chapter will have a biased view of the world. We decided to reduce this bias by assigning multiple authors to chapters. The goal of the assignments was to ensure a balanced presentation of the material, one coming directly from the experts in the field.
We also encouraged the authors to tie their chapters to other chapters in the book. Though difficult at times, this transforms the text from a collection of papers into a coherent book.

The book is organized into five sections. The first section presents the range of dynamical recurrent network (DRN) architectures that will be used in the book. With these architectures in hand, the second section examines their capabilities as computational devices. The third section presents several training algorithms for solving the network loading problem. The fourth section looks back and tempers our enthusiasm by acknowledging the limitations of DRNs. The final section deals with applications of recurrent networks.

The material in this book assumes a basic understanding of neural networks; we assume the reader has read an introductory textbook covering multilayered perceptrons and back-propagation. As such, this book would be ideal for a second course in neural networks.
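The architectures covered in the first section all share the defining feature of recurrence: the network's hidden state at one time step feeds back as an input at the next, giving the network a memory of the sequence seen so far. As a minimal illustration (ours, not the book's), the following sketch implements one forward step of an Elman-style simple recurrent network in plain Python; all weight names and sizes here are hypothetical, chosen only for the example.

```python
# Minimal sketch of an Elman-style simple recurrent network (illustrative
# only, not taken from the book): h_t = tanh(W_in x_t + W_rec h_{t-1} + b).
import math
import random

random.seed(0)

def elman_step(x, h, W_in, W_rec, b):
    """One forward step: mix the current input x with the previous hidden
    state h through weight matrices W_in and W_rec, then squash with tanh."""
    n_hidden = len(h)
    new_h = []
    for i in range(n_hidden):
        s = b[i]
        s += sum(W_in[i][j] * x[j] for j in range(len(x)))
        s += sum(W_rec[i][j] * h[j] for j in range(n_hidden))
        new_h.append(math.tanh(s))
    return new_h

# Hypothetical tiny network: 2 inputs, 3 hidden units, small random weights.
n_in, n_hid = 2, 3
W_in  = [[random.uniform(-0.5, 0.5) for _ in range(n_in)]  for _ in range(n_hid)]
W_rec = [[random.uniform(-0.5, 0.5) for _ in range(n_hid)] for _ in range(n_hid)]
b     = [0.0] * n_hid

# Run a short input sequence; the hidden state carries context forward,
# so the final state depends on the whole sequence, not just the last input.
h = [0.0] * n_hid
sequence = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
for x in sequence:
    h = elman_step(x, h, W_in, W_rec, b)
```

The feedback through `W_rec` is exactly what distinguishes a DRN from the multilayered perceptrons the reader is assumed to know: training must now account for dependencies across time steps, which is the subject of the book's section on training algorithms.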