In this work, a modification is made to the multi-objective (MOBJ) training algorithm for artificial neural networks (NN) of the Multilayer Perceptron (MLP) type in order to increase its computational efficiency. Usually, the number of efficient solutions to be generated is a parameter that must be supplied by the user. Here, this number is determined automatically by the algorithm through a golden-section search, and is generally smaller than the value a user would specify, yielding an appreciable reduction in processing time while preserving the high generalization capability of the solutions obtained with the original method.
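The abstract does not detail how the golden-section rule is applied, but the general idea of a golden-section search over a scalar quality measure can be sketched as follows. This is a minimal sketch, not the authors' implementation: the scoring function `validation_error`, the search interval, and the tolerance are illustrative assumptions standing in for whatever criterion the MOBJ algorithm actually optimizes.

```python
import math

def golden_section_min(f, a, b, tol=1.0):
    """Golden-section search for the minimum of a unimodal function f on [a, b].

    Each iteration shrinks the bracketing interval by the factor 1/phi ~ 0.618,
    reusing one interior evaluation, so only one new call to f is needed per step.
    Returns the midpoint of the final interval.
    """
    invphi = (math.sqrt(5) - 1) / 2  # 1/phi, the golden ratio's reciprocal
    c = b - invphi * (b - a)         # lower interior point
    d = a + invphi * (b - a)         # upper interior point
    fc, fd = f(c), f(d)
    while (b - a) > tol:
        if fc < fd:                  # minimum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - invphi * (b - a)
            fc = f(c)
        else:                        # minimum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + invphi * (b - a)
            fd = f(d)
    return (a + b) / 2

def validation_error(n_solutions):
    """Hypothetical scoring function: train the MOBJ algorithm with this many
    points on the Pareto front and return the validation error of the best one.
    The quadratic below is a runnable placeholder for that unimodal error curve."""
    n = max(2, round(n_solutions))
    return (n - 17) ** 2  # placeholder; a real version would train and validate

# Select the number of efficient solutions automatically instead of asking the user.
best_n = round(golden_section_min(validation_error, 2, 100))
print(f"selected number of efficient solutions: {best_n}")
```

Because golden-section search needs only one new evaluation per iteration and converges in O(log((b - a)/tol)) steps, far fewer candidate Pareto points are trained and evaluated than in an exhaustive sweep, which is consistent with the reduction in processing time the abstract reports.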