A reliable resilient backpropagation method with gradient ascent
ICIC'06 Proceedings of the 2006 international conference on Intelligent computing: Part II
A method of supervised learning for multilayer artificial neural networks to escape local minima is proposed. The learning model has two phases: a backpropagation phase and a gradient ascent phase. The backpropagation phase performs steepest descent on a surface in weight space whose height at any point equals an error measure, and it finds a set of weights minimizing this error measure. When backpropagation gets stuck in a local minimum, the gradient ascent phase attempts to fill up the valley by modifying the gain parameters in the direction of gradient ascent on the error measure. The two phases are repeated until the network escapes the local minimum. The algorithm has been tested on benchmark problems such as exclusive-or (XOR), parity, alphabetic character learning, and recognition of noisy Arabic numerals, as well as a realistic real-world problem: classification of radar returns from the ionosphere. On all of these problems, the network is shown to escape backpropagation's local minima and to converge faster with the proposed method than with simulated annealing techniques.
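The two-phase scheme described above can be sketched in code. The following is a minimal illustration, not the paper's exact update rules: it assumes sigmoid units with a per-layer gain parameter g (i.e. f(z) = 1/(1 + exp(-g*z))), a single hidden layer trained on XOR, and a simple plateau test to decide when the network is "stuck"; the network size, learning rates, and stuck-detection threshold are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR task: the first benchmark mentioned in the abstract.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 4 units (an illustrative size, not from the paper).
W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)
g1, g2 = 1.0, 1.0  # gain parameters, one per layer for simplicity

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    z1 = X @ W1 + b1; h = sigmoid(g1 * z1)
    z2 = h @ W2 + b2; y = sigmoid(g2 * z2)
    return z1, h, z2, y

def loss(y):
    return 0.5 * np.mean((y - T) ** 2)

def descent_step(lr=0.5):
    """Backpropagation phase: steepest descent on the weights."""
    global W1, b1, W2, b2
    z1, h, z2, y = forward(X)
    d2 = (y - T) * g2 * y * (1 - y) / len(X)      # dL/dz2 (scaled by gain)
    d1 = (d2 @ W2.T) * g1 * h * (1 - h)           # dL/dz1
    W2 -= lr * h.T @ d2; b2 -= lr * d2.sum(0)
    W1 -= lr * X.T @ d1; b1 -= lr * d1.sum(0)

def ascent_step(lr=0.1):
    """Gradient ascent phase: move the gains uphill in the error
    measure to 'fill up the valley' around a local minimum."""
    global g1, g2
    z1, h, z2, y = forward(X)
    d2 = (y - T) * y * (1 - y) / len(X)
    dg2 = np.sum(d2 * z2)                         # dL/dg2
    d1 = (d2 * g2) @ W2.T * h * (1 - h)
    dg1 = np.sum(d1 * z1)                         # dL/dg1
    g2 += lr * dg2                                # ascent, not descent
    g1 += lr * dg1

loss0 = loss(forward(X)[3])
prev = np.inf
for epoch in range(2000):
    descent_step()
    cur = loss(forward(X)[3])
    # Plateau test (illustrative): if the error has stopped moving but is
    # still high, assume a local minimum and switch to the ascent phase.
    if abs(prev - cur) < 1e-9 and cur > 1e-3:
        for _ in range(5):
            ascent_step()
    prev = cur

loss_final = loss(forward(X)[3])
print(f"initial loss {loss0:.4f} -> final loss {loss_final:.4f}")
```

Alternating the two phases in this way lets descent resume from a reshaped error surface rather than restarting from random weights, which is the intuition behind the reported speedup over simulated annealing.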