Improving the convergence of the back-propagation algorithm. Neural Networks.
Accelerating backpropagation through dynamic self-adaptation. Neural Networks.
Efficient block training of multilayer perceptrons. Neural Computation.
Adaptive improved natural gradient algorithm for blind source separation. Neural Computation.
Incremental backpropagation learning networks. IEEE Transactions on Neural Networks.
Dynamic tunneling technique for efficient training of multilayer perceptrons. IEEE Transactions on Neural Networks.
A new error function at hidden layers for fast training of multilayer perceptrons. IEEE Transactions on Neural Networks.
H∞-learning of layered neural networks. IEEE Transactions on Neural Networks.
Deterministic nonmonotone strategies for effective training of multilayer perceptrons. IEEE Transactions on Neural Networks.
Magnified gradient function with deterministic weight modification in adaptive learning. IEEE Transactions on Neural Networks.
A new adaptive backpropagation algorithm based on Lyapunov stability theory for neural networks. IEEE Transactions on Neural Networks.
An accelerated learning algorithm for multilayer perceptrons: optimization layer by layer. IEEE Transactions on Neural Networks.
This paper presents the local coupled feedforward neural network. Its connection structure is the same as that of a multilayer perceptron with one hidden layer. In the local coupled feedforward neural network, each hidden node is assigned an address in the input space, and each input activates only the hidden nodes near it; for each input, only the activated hidden nodes take part in the forward and backward propagation passes. Theoretical analysis and simulation results show that this network possesses the universal approximation property and can solve the learning problem of feedforward neural networks. In addition, its local coupling makes knowledge accumulation possible.
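The idea of address-based local activation can be illustrated with a minimal sketch. This is not the paper's implementation: the class name, the choice of a fixed k-nearest-node activation rule, the tanh activation, and all hyperparameters are assumptions made for illustration; the sketch only shows how restricting the forward and backward passes to the hidden nodes nearest an input might look in code.

```python
import numpy as np

rng = np.random.default_rng(0)

class LocalCoupledFFN:
    """Sketch of a one-hidden-layer network in which each hidden node has an
    'address' in the input space and only the k nodes nearest to an input
    participate in the forward and backward passes (hypothetical design)."""

    def __init__(self, n_in, n_hidden, n_out, k=3, lr=0.1):
        self.k = k
        self.lr = lr
        # Assumed addressing scheme: one address per hidden node, fixed at init.
        self.addr = rng.uniform(-1.0, 1.0, (n_hidden, n_in))
        self.W1 = rng.normal(0.0, 0.5, (n_hidden, n_in))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_out, n_hidden))
        self.b2 = np.zeros(n_out)

    def _active(self, x):
        # Indices of the k hidden nodes whose addresses are closest to x.
        d = np.linalg.norm(self.addr - x, axis=1)
        return np.argsort(d)[: self.k]

    def forward(self, x):
        idx = self._active(x)
        # Only the active hidden nodes compute an output.
        h = np.tanh(self.W1[idx] @ x + self.b1[idx])
        y = self.W2[:, idx] @ h + self.b2
        return y, h, idx

    def train_step(self, x, t):
        # One SGD step on squared error; gradients touch only active nodes.
        y, h, idx = self.forward(x)
        e = y - t
        gW2 = np.outer(e, h)
        gz = (self.W2[:, idx].T @ e) * (1.0 - h ** 2)  # tanh derivative
        self.W2[:, idx] -= self.lr * gW2
        self.b2 -= self.lr * e
        self.W1[idx] -= self.lr * np.outer(gz, x)
        self.b1[idx] -= self.lr * gz
        return float(e @ e)
```

Because each input consistently trains the same nearby subset of hidden nodes, updates for one region of the input space leave the weights serving other regions untouched, which is one way to read the paper's claim that local coupling enables knowledge accumulation.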