A variation of the well-known Levenberg-Marquardt algorithm for training neural networks is proposed in this work. The algorithm restricts the norm of the weight vector to a preestablished value and finds the minimum-error solution for that norm. The norm constraint controls the neural network's degrees of freedom: the larger the norm, the more flexible the neural model and the more closely it fits the training set. A range of solutions with different norm values is generated, and the solution that generalizes best is selected according to the validation-set error. The results show the efficiency of the algorithm in terms of generalization performance.
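The sketch below illustrates the overall scheme described in the abstract: train under a fixed weight-norm constraint, repeat for a range of norm values, and pick the value whose model has the lowest validation error. It is not the paper's algorithm: plain gradient descent with a projection onto the prescribed norm stands in for the norm-constrained Levenberg-Marquardt step, and the network size, toy data, learning rate, and norm grid are all illustrative assumptions.

```python
# Hedged sketch (assumptions throughout, NOT the paper's exact method):
# projected gradient descent replaces the Levenberg-Marquardt step.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data split into training and validation sets (assumption).
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=200)
X_tr, y_tr, X_va, y_va = X[:150], y[:150], X[150:], y[150:]

H = 10  # hidden units (assumption)

def unpack(w):
    """Split the flat weight vector into the layers of a 1-H-1 network."""
    W1 = w[:H].reshape(1, H); b1 = w[H:2 * H]
    W2 = w[2 * H:3 * H].reshape(H, 1); b2 = w[3 * H:]
    return W1, b1, W2, b2

def forward(w, X):
    W1, b1, W2, b2 = unpack(w)
    return (np.tanh(X @ W1 + b1) @ W2 + b2).ravel()

def mse(w, X, y):
    return np.mean((forward(w, X) - y) ** 2)

def numerical_grad(w, X, y, eps=1e-5):
    """Finite-difference gradient; keeps the sketch dependency-free."""
    g = np.zeros_like(w)
    for i in range(w.size):
        d = np.zeros_like(w); d[i] = eps
        g[i] = (mse(w + d, X, y) - mse(w - d, X, y)) / (2 * eps)
    return g

def train_with_norm(radius, steps=500, lr=0.1):
    """Minimize training error while keeping the weight vector at ||w|| = radius."""
    w = rng.normal(scale=0.1, size=3 * H + 1)
    w *= radius / np.linalg.norm(w)          # start on the constraint surface
    for _ in range(steps):
        w -= lr * numerical_grad(w, X_tr, y_tr)
        w *= radius / np.linalg.norm(w)      # re-project after each step
    return w

# Sweep a range of norm values and keep the one with the lowest validation
# error, mirroring the model-selection scheme in the abstract.
best = None
for radius in [0.5, 1.0, 2.0, 4.0, 8.0]:
    w = train_with_norm(radius)
    err_va = mse(w, X_va, y_va)
    print(f"norm={radius:4.1f}  train={mse(w, X_tr, y_tr):.4f}  val={err_va:.4f}")
    if best is None or err_va < best[0]:
        best = (err_va, radius, w)

print(f"selected norm value: {best[1]} (validation MSE {best[0]:.4f})")
```

The loop over norm values plays the role of a regularization path: small norms give rigid, underfitted models, large norms give flexible, overfitted ones, and the validation set picks the trade-off in between.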