In this paper, an effective random fuzzy back-propagation (RFBP) learning algorithm for neural networks is developed. With this new learning algorithm, a neural network not only learns accurately but also has a higher probability of escaping from local minima during training. To demonstrate that the proposed algorithm outperforms existing methods, classification of the non-convex in two dimensions (NC2) problem is simulated. For comparison, the same simulations are also performed using the conventional back-propagation (BP) learning algorithm with constant pairs of learning rate (α = 0.1-0.9) and momentum (ξ = 0.1-0.9), as well as stochastic BP learning.
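The comparison baseline above — conventional BP with a constant learning rate α and momentum ξ — uses the update Δw(t) = −α ∂E/∂w + ξ Δw(t−1). A minimal sketch of that baseline (not the authors' RFBP method; the network size, initialization range, and the AND classification task are illustrative assumptions) might look like:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_bp(samples, epochs=2000, alpha=0.5, xi=0.8, hidden=2, seed=0):
    """Conventional BP with constant learning rate (alpha) and momentum (xi).
    Per-weight update: dw(t) = -alpha * grad + xi * dw(t-1).
    Architecture (2-2-1 sigmoid net) and hyperparameters are illustrative."""
    rng = random.Random(seed)
    n_in = len(samples[0][0])
    # Hidden-layer weights (one bias column) and output weights (one bias term).
    W1 = [[rng.uniform(-0.5, 0.5) for _ in range(n_in + 1)] for _ in range(hidden)]
    W2 = [rng.uniform(-0.5, 0.5) for _ in range(hidden + 1)]
    dW1 = [[0.0] * (n_in + 1) for _ in range(hidden)]   # previous updates (momentum memory)
    dW2 = [0.0] * (hidden + 1)
    for _ in range(epochs):
        for x, t in samples:
            # Forward pass.
            xb = list(x) + [1.0]
            h = [sigmoid(sum(w * v for w, v in zip(row, xb))) for row in W1]
            hb = h + [1.0]
            y = sigmoid(sum(w * v for w, v in zip(W2, hb)))
            # Backward pass: deltas for squared-error loss with sigmoid units.
            delta_o = (y - t) * y * (1.0 - y)
            delta_h = [delta_o * W2[j] * h[j] * (1.0 - h[j]) for j in range(hidden)]
            # Momentum update for output weights.
            for j in range(hidden + 1):
                dW2[j] = -alpha * delta_o * hb[j] + xi * dW2[j]
                W2[j] += dW2[j]
            # Momentum update for hidden weights.
            for j in range(hidden):
                for i in range(n_in + 1):
                    dW1[j][i] = -alpha * delta_h[j] * xb[i] + xi * dW1[j][i]
                    W1[j][i] += dW1[j][i]

    def predict(x):
        xb = list(x) + [1.0]
        h = [sigmoid(sum(w * v for w, v in zip(row, xb))) for row in W1] + [1.0]
        return sigmoid(sum(w * v for w, v in zip(W2, h)))
    return predict

# Toy task (illustrative, not the NC2 problem from the paper): logical AND.
AND = [((0, 0), 0.0), ((0, 1), 0.0), ((1, 0), 0.0), ((1, 1), 1.0)]
predict = train_bp(AND)
mse = sum((predict(x) - t) ** 2 for x, t in AND) / len(AND)
```

With momentum, the previous update `dW(t-1)` is blended into the current step, which smooths the trajectory but, with a fixed α and ξ, can still leave the network stuck in a local minimum — the failure mode the RFBP algorithm is designed to mitigate.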