An improved Optical Backpropagation (OBP) algorithm with a third term is proposed for training single-hidden-layer feedforward neural networks. The major limitations of the backpropagation algorithm are the local-minima problem and its slow rate of convergence. To address these problems, we propose an algorithm that introduces a third term into optical backpropagation (OBPWT). The method has been applied to multilayer neural networks to improve efficiency in terms of convergence speed. In the proposed algorithm, a non-linear function is applied to the error term before the backpropagation phase, and this transformed error is then used together with a third term in the weight-update rule. We show that the new algorithm drastically accelerates training convergence while maintaining the neural network's performance. The effectiveness of the proposed algorithm has been demonstrated on five benchmark problems; the simulation results show that it speeds up learning and hence the rate of convergence.
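The abstract describes two ingredients: a non-linear transform of the output error before backpropagation, and a third term added to the usual gradient-plus-momentum weight update. The exact formulas are not given here, so the sketch below is only illustrative: it assumes a sign-preserving exponential amplification of the error (in the spirit of optical backpropagation) and a third term proportional to the update from two steps back, trained on the XOR benchmark. All hyperparameters and function choices are assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR benchmark: 2 inputs, one hidden layer of 4 units, 1 output.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
d = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0.0, 1.0, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0.0, 1.0, (4, 1)); b2 = np.zeros(1)

# Learning rate, momentum, and third-term gain (all assumed values).
eta, alpha, beta = 0.1, 0.5, 0.05
prev  = {"W1": 0.0, "b1": 0.0, "W2": 0.0, "b2": 0.0}  # last update
prev2 = dict(prev)                                     # update before that

def forward(X):
    h = sigmoid(X @ W1 + b1)
    return h, sigmoid(h @ W2 + b2)

def mse():
    return float(np.mean((d - forward(X)[1]) ** 2))

loss0 = mse()
for _ in range(5000):
    h, o = forward(X)
    e = d - o
    # Assumed OBP-style non-linearity: sign-preserving exponential
    # amplification of the error (expm1(|e|) ~ |e| for small errors,
    # so it reduces to standard backpropagation near convergence).
    e_opt = np.sign(e) * np.expm1(np.abs(e))

    # Standard backpropagation of the transformed error.
    delta2 = e_opt * o * (1.0 - o)
    gW2, gb2 = h.T @ delta2, delta2.sum(axis=0)
    delta1 = (delta2 @ W2.T) * h * (1.0 - h)
    gW1, gb1 = X.T @ delta1, delta1.sum(axis=0)

    # Three-term update: gradient + momentum + an assumed third term
    # proportional to the update from two steps earlier.
    for name, g in (("W1", gW1), ("b1", gb1), ("W2", gW2), ("b2", gb2)):
        step = eta * g + alpha * prev[name] + beta * prev2[name]
        prev2[name], prev[name] = prev[name], step
        if name == "W1": W1 += step
        elif name == "b1": b1 += step
        elif name == "W2": W2 += step
        else: b2 += step

loss_final = mse()
```

On this toy run the mean-squared error drops well below its initial value; the point of the sketch is only to show where the error transform and the third term enter the update, not to reproduce the paper's reported speed-ups.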