Pattern recognition using neural networks: theory and algorithms for engineers and scientists
Parameter convergence and learning curves for neural networks. Neural Computation.
Deterministic convergence of an online gradient method for neural networks. Journal of Computational and Applied Mathematics (Selected papers of the International Symposium on Applied Mathematics, August 2000, Dalian, China).
Convergence of an online gradient method for feedforward neural networks with stochastic inputs. Journal of Computational and Applied Mathematics (Special issue: Proceedings of the International Symposium on Computational Mathematics and Applications).
Optimal convergence of on-line backpropagation. IEEE Transactions on Neural Networks.
When does online BP training converge? IEEE Transactions on Neural Networks.
An online gradient method for BP neural networks is presented and discussed. The input training examples are permuted stochastically at the start of each training cycle. The monotonicity of the error sequence and a weak convergence result of a deterministic nature are proved for the method.
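The scheme described in the abstract can be illustrated with a minimal sketch: a single-hidden-layer BP network trained by online (per-example) gradient descent, where the presentation order of the training examples is re-permuted stochastically at the start of every cycle. The network size, learning rate, and XOR toy data below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data (assumed for illustration): the XOR problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([0.0, 1.0, 1.0, 0.0])

n_hid = 4                                       # hidden units (assumed)
W1 = rng.normal(scale=0.5, size=(n_hid, 2))     # input -> hidden weights
w2 = rng.normal(scale=0.5, size=n_hid)          # hidden -> output weights
eta = 0.5                                       # learning rate (assumed)

def mse():
    preds = np.array([sigmoid(w2 @ sigmoid(W1 @ x)) for x in X])
    return float(np.mean((preds - T) ** 2))

err_before = mse()

for cycle in range(5000):
    order = rng.permutation(len(X))   # stochastic permutation each cycle
    for i in order:                   # online update: one example at a time
        h = sigmoid(W1 @ X[i])                    # hidden activations
        y = sigmoid(w2 @ h)                       # network output
        delta_out = (y - T[i]) * y * (1 - y)      # output-layer error signal
        delta_hid = delta_out * w2 * h * (1 - h)  # backpropagated error
        w2 -= eta * delta_out * h
        W1 -= eta * np.outer(delta_hid, X[i])

err_after = mse()
```

The per-cycle shuffle is what distinguishes this "stochastic-order" online method from fixed-order cyclic training; the convergence results in the paper concern the behavior of the error sequence under such updates.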