Selection of activation functions in the last hidden layer of the multilayer perceptron
ICAISC'12 Proceedings of the 11th international conference on Artificial Intelligence and Soft Computing - Volume Part I
This article presents a fast and uncomplicated method of modifying multilayer perceptrons that allows a considerable single-step reduction of the cost function, which in this case is the mean squared error. The method consists in, but is not limited to, changing the neuron activation functions in the last hidden layer and applying the least squares method once. No changes are made to neuron weights in any hidden layer. An essential strength of the method is that it can be used to improve the operation of networks trained earlier, so the learning process need not be restarted from the very beginning.
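The single application of the least squares method described above can be sketched as follows: with all hidden-layer weights frozen, the outputs of the last hidden layer form a fixed design matrix, and the output-layer weights that minimize the mean squared error are obtained in closed form. This is only an illustrative sketch under that reading of the abstract, not the paper's implementation; the function name and shapes below are assumptions.

```python
import numpy as np

def refit_output_layer(hidden_activations, targets):
    """One-step least-squares refit of the output-layer weights.

    hidden_activations: (n_samples, n_hidden) array of last-hidden-layer
        outputs, with all earlier weights kept frozen (hypothetical setup).
    targets: (n_samples, n_outputs) array of desired network outputs.
    Returns the weight matrix minimizing the mean squared error.
    """
    n = hidden_activations.shape[0]
    # Append a constant column so the output layer keeps a bias term.
    H = np.hstack([hidden_activations, np.ones((n, 1))])
    # Single application of the least squares method.
    W, *_ = np.linalg.lstsq(H, targets, rcond=None)
    return W

# Toy usage with random "last hidden layer" activations and targets.
rng = np.random.default_rng(0)
H = np.tanh(rng.standard_normal((100, 5)))
Y = rng.standard_normal((100, 2))
W = refit_output_layer(H, Y)
pred = np.hstack([H, np.ones((100, 1))]) @ W
```

Because the hidden layers are untouched, the refit is a single linear solve rather than an iterative retraining pass, which is what makes the reduction of the cost function "single-step".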