This paper proposes a novel method to improve both the generalization and convergence performance of the backpropagation (BP) algorithm by using multiple cost functions combined with a randomizing scheme. Under certain conditions, the randomized technique converges to the global minimum with probability one. Experimental results on benchmark Encoder-Decoder problems and the NC2 classification problem show that the method is effective in enhancing the convergence and generalization performance of BP.
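The core idea can be illustrated with a minimal sketch: train a small network on a 4-3-4 Encoder-Decoder task, and at each epoch randomly select which cost function (mean squared error or cross-entropy) drives the backpropagated error. The specific cost functions, the uniform 50/50 selection rule, and the network sizes below are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 4-3-4 Encoder-Decoder task: reproduce one-hot inputs at the output.
X = np.eye(4)
T = X.copy()

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Weights and biases for one hidden layer (sizes are illustrative).
W1 = rng.normal(scale=0.5, size=(4, 3)); b1 = np.zeros(3)
W2 = rng.normal(scale=0.5, size=(3, 4)); b2 = np.zeros(4)

def forward(x):
    h = sigmoid(x @ W1 + b1)
    y = sigmoid(h @ W2 + b2)
    return h, y

# Two cost functions, expressed via their output-layer error signals
# (derivative of the cost w.r.t. the output pre-activation, sigmoid units).
def mse_delta(y, t):
    return (y - t) * y * (1.0 - y)      # mean squared error

def xent_delta(y, t):
    return y - t                         # cross-entropy

_, y0 = forward(X)
initial_err = float(np.mean((y0 - T) ** 2))

lr = 1.0
for epoch in range(5000):
    # Randomizing scheme (assumed here): pick one cost per epoch at random.
    delta_fn = mse_delta if rng.random() < 0.5 else xent_delta
    h, y = forward(X)
    d_out = delta_fn(y, T)
    d_hid = (d_out @ W2.T) * h * (1.0 - h)   # backpropagate through the hidden layer
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_hid; b1 -= lr * d_hid.sum(axis=0)

_, y = forward(X)
final_err = float(np.mean((y - T) ** 2))
print(initial_err, final_err)
```

Randomly alternating cost functions changes the shape of the error surface from epoch to epoch, which is the mechanism the paper relies on to escape poor local minima; the sketch above only demonstrates that such alternation is compatible with ordinary gradient-descent training.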