We perform a detailed fixed-point analysis of two-unit recurrent neural networks with sigmoid-shaped transfer functions. Using geometric arguments in the space of transfer-function derivatives, we partition the network state space into distinct regions corresponding to the stability types of the fixed points. Unlike previous studies, we assume no special form of connectivity between the neurons, and all free parameters are allowed to vary. We also prove that when both neurons have excitatory self-connections and the mutual interactions have the same sign (i.e., the neurons either mutually excite or mutually inhibit each other), new attractive fixed points are created through a saddle-node bifurcation. Finally, for an N-neuron recurrent network, we give lower bounds on the rate at which attractive periodic points converge toward the saturation values of the neuron activations as the absolute values of the connection weights grow.
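The two phenomena described above can be observed numerically. The sketch below iterates a two-unit map x ← sigmoid(Wx + b) with excitatory self-connections and mutually excitatory coupling: at a small weight scale there is a single attractive fixed point, at a larger scale additional attractors have appeared, and the attractors move toward the saturation values 0/1 as the weights grow. The particular weight matrix, bias, and grid of initial states are illustrative choices for this sketch, not values taken from the paper.

```python
import numpy as np

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def iterate(W, b, x0, steps=1000):
    # Iterate the two-unit map x <- sigmoid(W x + b); for these parameter
    # choices the iteration settles onto an attractive fixed point.
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = sigmoid(W @ x + b)
    return x

def attractors(W, b):
    # Collect the distinct attractive fixed points reached from a small
    # grid of initial states in the unit square.
    pts = []
    for u in np.linspace(0.1, 0.8, 4):
        for v in np.linspace(0.1, 0.8, 4):
            p = iterate(W, b, [u, v])
            if not any(np.allclose(p, q, atol=1e-6) for q in pts):
                pts.append(p)
    return pts

def net(w):
    # Excitatory self-connections (positive diagonal), mutually excitatory
    # coupling (positive off-diagonal); the bias roughly centres the map.
    W = np.array([[w, w / 2], [w / 2, w]])
    b = np.array([-0.75 * w, -0.75 * w])
    return W, b

# New attractive fixed points appear as the weight scale w grows.
for w in [2.0, 8.0]:
    print(f"w={w}: {len(attractors(*net(w)))} attractive fixed point(s)")

# The attractors approach the saturation value 1 as |w| grows.
for w in [4.0, 8.0, 16.0]:
    p = iterate(*net(w), [0.9, 0.9])
    print(f"w={w}: distance to saturation = {1.0 - p[0]:.2e}")
```

At w = 2 the Jacobian norm stays below 1 everywhere (sigmoid slope ≤ 1/4 times a weight matrix of spectral norm 3), so the map is a global contraction with a single attractor; at w = 8 the central fixed point has lost stability and two attractive fixed points near saturation coexist with it.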