This paper presents a novel neural network model with a hybrid quantized architecture that improves the performance of conventional Elman networks. A quantum gate technique is introduced to resolve the pattern mismatch between the input stream and the one-time-delay state feedback. A quantized back-propagation training algorithm with an adaptive dead-zone scheme is developed, providing an optimal or suboptimal tradeoff between convergence speed and generalization performance. Furthermore, the effectiveness of the new real-time learning algorithm is demonstrated by proving convergence of the quantum gate parameters via the Lyapunov method. Numerical experiments are carried out to verify the accuracy of the theoretical results.
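The abstract does not spell out the architecture, but quantum-gate (quantized) neurons in this literature are commonly modeled as a superposition of shifted sigmoids, yielding a multi-level, staircase-like activation. The following is a minimal, hypothetical sketch of an Elman-style cell whose one-time-delay state feedback passes through such a multi-level activation before re-entering the hidden layer; all names, jump positions, and dimensions here are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def quantum_sigmoid(x, jumps):
    # Multi-level "quantum gate" activation: an average of sigmoids
    # shifted by the jump positions `jumps`, giving a staircase-like
    # (quantized) response. The jump positions here are assumptions.
    return np.mean([sigmoid(x - t) for t in jumps], axis=0)

class QuantizedElman:
    # Hypothetical Elman cell: the context vector is the quantized
    # copy of the previous hidden state (one-time-delay feedback).
    def __init__(self, n_in, n_hidden, n_out,
                 jumps=(-1.0, 0.0, 1.0), seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(scale=0.1, size=(n_hidden, n_in))
        self.W_ctx = rng.normal(scale=0.1, size=(n_hidden, n_hidden))
        self.W_out = rng.normal(scale=0.1, size=(n_out, n_hidden))
        self.jumps = jumps
        self.h = np.zeros(n_hidden)

    def step(self, x):
        # Quantize last step's hidden state before feeding it back.
        ctx = quantum_sigmoid(self.h, self.jumps)
        self.h = np.tanh(self.W_in @ x + self.W_ctx @ ctx)
        return self.W_out @ self.h

# Drive the cell with a short synthetic input sequence.
net = QuantizedElman(n_in=2, n_hidden=4, n_out=1)
ys = [net.step(np.array([np.sin(0.1 * t), np.cos(0.1 * t)]))
      for t in range(5)]
```

In a full implementation, the jump positions of `quantum_sigmoid` would themselves be trained by the quantized back-propagation algorithm with the adaptive dead zone; the sketch above only fixes them to show the forward pass.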