References:
- Learning sequential structure in simple recurrent networks. Advances in Neural Information Processing Systems 1.
- Learning internal representations by error propagation. In Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1.
- A learning algorithm for continually running fully recurrent neural networks. Neural Computation.
A new recurrent neural network, the subconnection neural network (SCNN), which has feedback-to-weight connections, is proposed for event-driven temporal sequence processing. After a discussion of the recurrence relation in the SCNN's gradient computation, approximate learning algorithms for the SCNN are derived. These algorithms are examined on three types of event-driven temporal sequence processing: permutation, combination, and integration. It is demonstrated that the SCNN with the simplest learning algorithm is powerful enough to handle all three. Simulations confirm the superiority of the SCNN over Elman's and Jordan's networks.
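To make the "feedback-to-weight" idea concrete, here is a minimal sketch of one possible reading of it: instead of feeding the previous output back as an extra input (as Elman's and Jordan's networks do), the previous output modulates the input weights themselves. The class name `SubconnectionCell`, the modulation rule `W + S * y_prev`, and all parameter shapes are illustrative assumptions, not the paper's actual SCNN definition.

```python
import numpy as np

class SubconnectionCell:
    """Hypothetical sketch of a recurrent cell with feedback-to-weight
    connections: the previous output y_prev modulates the input weights,
    rather than being concatenated to the input. This is an illustrative
    interpretation of the abstract, not the paper's exact SCNN."""

    def __init__(self, n_in, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=0.5, size=(n_out, n_in))  # base input weights
        self.S = rng.normal(scale=0.5, size=(n_out, n_in))  # subconnection gains (assumed)
        self.y_prev = np.zeros(n_out)                       # previous output, initially zero

    def step(self, x):
        # Effective weights at this time step depend on the fed-back output.
        W_eff = self.W + self.S * self.y_prev[:, None]
        y = np.tanh(W_eff @ x)
        self.y_prev = y
        return y

cell = SubconnectionCell(n_in=3, n_out=2)
seq = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
outs = [cell.step(x) for x in seq]
```

Because the weights themselves carry the temporal state, the same input vector can produce different outputs depending on the events that preceded it, which is the property the event-driven sequence tasks (permutation, combination, integration) exercise.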