This paper proposes a novel supervised learning method for single-layer feedforward neural networks. The approach uses an objective function that is an alternative to the standard MSE: it measures the errors before the neurons' nonlinear activation functions instead of after them. With this criterion, the solution can be obtained simply by solving systems of linear equations, which requires far less computation than standard methods. A theoretical study is included to prove the approximate equivalence between the global optimum of the objective function based on the standard MSE criterion and that of the proposed alternative MSE function. Furthermore, it is shown that the presented method supports incremental and distributed learning. An exhaustive experimental study, covering 10 classification and 16 regression problems, verifies the soundness and efficiency of the method. In addition, a comparison with other high-performance learning algorithms shows that the proposed method exhibits, on average, the highest performance while demanding little computation.
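To make the idea concrete, the following is a minimal sketch of the kind of training the abstract describes, assuming a logistic activation, a single layer, and a NumPy implementation; the function names, the ridge regularisation term, and the toy data are illustrative assumptions, not the authors' code.

```python
# Minimal sketch (assumption, not the paper's implementation): train a
# single-layer network by measuring the error *before* the logistic
# activation, so the weights come from one linear least-squares system
# instead of iterative gradient descent.
import numpy as np

def fit_before_activation(X, D, eps=1e-6, ridge=1e-8):
    """X: (n_samples, n_inputs) inputs; D: (n_samples, n_outputs) targets in (0, 1).

    Instead of minimising ||sigmoid(X W) - D||^2, map the targets through the
    inverse activation (the logit) and minimise ||X W - logit(D)||^2, which is
    an ordinary linear least-squares problem.
    """
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])      # append a bias column
    Dc = np.clip(D, eps, 1 - eps)                       # keep logit finite
    Z = np.log(Dc / (1 - Dc))                           # targets before the nonlinearity
    A = Xb.T @ Xb + ridge * np.eye(Xb.shape[1])         # regularised normal equations
    W = np.linalg.solve(A, Xb.T @ Z)                    # weights, bias in the last row
    return W

def predict(W, X):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return 1.0 / (1.0 + np.exp(-(Xb @ W)))              # outputs after the sigmoid

# Toy usage: recover a mapping that is linear before the sigmoid.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
D = 1.0 / (1.0 + np.exp(-(X @ np.array([[1.0], [-2.0], [0.5]]) + 0.3)))
W = fit_before_activation(X, D)
print(np.abs(predict(W, X) - D).max())                  # residual should be small
```

In this formulation the solution depends only on the accumulated matrices `Xb.T @ Xb` and `Xb.T @ Z`, so new samples, or partial sums computed on different nodes, can be added to those matrices before re-solving; this is consistent with the incremental and distributed learning capability the abstract mentions.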