We survey recent research on the supervised training of feedforward neural networks. The goal is to expose how the networks work, how to engineer them so that they learn the data without fitting extraneous noise, how to train them efficiently, and how to assure that the training is valid. The scope covers gradient descent with polynomial line search, from backpropagation through conjugate gradient and quasi-Newton methods. There is a consensus among researchers that adaptive step gains (learning rates) can stabilize and accelerate convergence, and that a good starting weight set improves both the training speed and the learning quality. The training problem includes both the design of a network function and the fitting of that function to a set of input and output data points by computing a set of coefficient weights. The form of the function can be adjusted by adjoining new neurons, pruning existing ones, and setting other parameters such as biases and exponential rates. Our exposition reveals several useful results that are readily implementable.
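As an illustration of the ideas mentioned in the abstract, the following is a minimal sketch (not code from the surveyed work) of batch backpropagation for a one-hidden-layer feedforward network with a simplified "bold driver" style adaptive step gain: the learning rate grows while the error keeps falling and shrinks after an uphill step. The network size, gain factors, initialization scale, and toy data are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def init_weights(n_in, n_hid, n_out, scale=0.5):
    # Small random starting weights; a good starting set speeds up training.
    W1 = rng.uniform(-scale, scale, (n_hid, n_in + 1))   # +1 column for bias
    W2 = rng.uniform(-scale, scale, (n_out, n_hid + 1))
    return W1, W2

def forward(X, W1, W2):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])          # append bias input
    H = np.tanh(Xb @ W1.T)                                  # hidden activations
    Hb = np.hstack([H, np.ones((H.shape[0], 1))])
    Y = Hb @ W2.T                                           # linear output layer
    return Xb, H, Hb, Y

def train(X, T, n_hid=8, epochs=2000, lr=0.05, up=1.05, down=0.5):
    W1, W2 = init_weights(X.shape[1], n_hid, T.shape[1])
    prev_err = np.inf
    for _ in range(epochs):
        Xb, H, Hb, Y = forward(X, W1, W2)
        err = 0.5 * np.mean((Y - T) ** 2)
        # Simplified adaptive step gain: accelerate while improving, back off
        # otherwise (the full bold-driver rule would also undo the bad step).
        lr = lr * up if err < prev_err else lr * down
        prev_err = err
        # Backpropagation of the error through the two weight layers.
        dY = (Y - T) / X.shape[0]                           # output-layer delta
        dH = (dY @ W2[:, :-1]) * (1.0 - H ** 2)             # hidden-layer delta
        W2 -= lr * dY.T @ Hb
        W1 -= lr * dH.T @ Xb
    return W1, W2

if __name__ == "__main__":
    # Toy regression target: fit y = sin(x) on [-pi, pi].
    X = np.linspace(-np.pi, np.pi, 50).reshape(-1, 1)
    T = np.sin(X)
    W1, W2 = train(X, T)
    print("final MSE:", np.mean((forward(X, W1, W2)[3] - T) ** 2))

The sketch uses plain gradient descent; the line-search, conjugate-gradient, and quasi-Newton variants discussed in the survey replace the fixed update direction and step-gain rule but leave the forward pass and the error backpropagation unchanged.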