Feedforward neural networks (FNNs) have been proposed to solve complex problems in pattern recognition, classification, and function approximation. Despite the general success of learning methods for FNNs, such as the backpropagation (BP) algorithm and second-order algorithms, the long training time required for convergence remains a problem to be overcome. In this paper, we propose a new hybrid algorithm for FNNs that combines unsupervised training of the hidden neurons (Kohonen algorithm) with supervised training of the output neurons (gradient descent method). Simulation results show the effectiveness of the proposed algorithm compared with other well-known learning methods.
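As a rough illustration of the two-stage scheme described in the abstract, the following is a minimal sketch, assuming winner-take-all Kohonen updates for the hidden prototypes, Gaussian hidden activations, and a linear output layer trained by gradient descent on a squared-error loss. The function names, hyperparameters, and activation choice are illustrative assumptions, not the paper's actual specification.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_hybrid(X, Y, n_hidden=10, kohonen_epochs=50, gd_epochs=200,
                 lr_kohonen=0.5, lr_gd=0.05, sigma=1.0):
    """Two-stage hybrid training: Kohonen for hidden units, gradient descent for outputs.
    (Hypothetical sketch; architecture details are assumptions, not from the paper.)"""
    n_samples, _ = X.shape

    # Stage 1: unsupervised Kohonen (winner-take-all) training of hidden prototypes,
    # initialized from randomly chosen training inputs.
    W_hidden = X[rng.choice(n_samples, n_hidden, replace=False)].copy()
    for epoch in range(kohonen_epochs):
        lr = lr_kohonen * (1.0 - epoch / kohonen_epochs)  # linearly decaying rate
        for x in X[rng.permutation(n_samples)]:
            winner = np.argmin(np.linalg.norm(W_hidden - x, axis=1))
            W_hidden[winner] += lr * (x - W_hidden[winner])  # pull winner toward input

    # Hidden activations: Gaussian response to distance from each prototype (assumed).
    def hidden_act(Z):
        d2 = ((Z[:, None, :] - W_hidden[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    # Stage 2: supervised gradient descent on the output weights only.
    H = hidden_act(X)
    W_out = rng.normal(scale=0.1, size=(n_hidden, Y.shape[1]))
    for _ in range(gd_epochs):
        err = H @ W_out - Y                        # prediction error
        W_out -= lr_gd * (H.T @ err) / n_samples   # gradient of mean squared error

    return hidden_act, W_out

# Example usage: a small 1-D regression problem (illustrative only).
X = rng.uniform(-3.0, 3.0, size=(200, 1))
Y = np.sin(X)
act, W_out = train_hybrid(X, Y)
print("training MSE:", np.mean((act(X) @ W_out - Y) ** 2))
```

The appeal of this split, as the abstract suggests, is that the expensive nonlinear part of the fit (placing hidden units) is handled without gradients, leaving only a cheap convex problem for the supervised stage.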