The local minima of the error surface of the 2-2-1 XOR network
Annals of Mathematics and Artificial Intelligence
The artificial neural network with one hidden unit and with the input units also connected directly to the output unit is considered. It is proven that the error surface of this network for the patterns of the XOR problem has minima with zero error, and that all other stationary points of the error surface are saddle points. Moreover, the regions in weight space containing saddle points have zero volume. Hence, when this network is trained on the four patterns of the XOR problem using, e.g., backpropagation with momentum, the correct solution with zero error is reached in the limit with probability one.
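As an illustrative sketch (not the authors' code), the architecture described in the abstract — one hidden unit, with shortcut connections from both inputs to the output — can be trained on the four XOR patterns with plain batch backpropagation and momentum. The learning rate, momentum coefficient, weight initialization, and epoch count below are arbitrary assumptions, not values from the paper:

```python
import numpy as np

# Four XOR input patterns and their targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([0, 1, 1, 0], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
wh = rng.normal(0, 0.5, 3)  # hidden unit: w1, w2, bias
wo = rng.normal(0, 0.5, 4)  # output unit: v_hidden, v1, v2, bias (v1, v2 are shortcuts)

lr, mom = 0.1, 0.9          # assumed hyperparameters
vel_h, vel_o = np.zeros(3), np.zeros(4)

def forward(x):
    h = sigmoid(wh[0] * x[0] + wh[1] * x[1] + wh[2])
    y = sigmoid(wo[0] * h + wo[1] * x[0] + wo[2] * x[1] + wo[3])
    return h, y

def total_error():
    # Sum-of-squares error over the four XOR patterns.
    return 0.5 * sum((forward(x)[1] - tt) ** 2 for x, tt in zip(X, t))

e0 = total_error()
for epoch in range(10000):
    gh, go = np.zeros(3), np.zeros(4)
    for x, tt in zip(X, t):
        h, y = forward(x)
        d_y = (y - tt) * y * (1 - y)               # output delta
        go += d_y * np.array([h, x[0], x[1], 1.0])
        d_h = d_y * wo[0] * h * (1 - h)            # hidden delta (backpropagated)
        gh += d_h * np.array([x[0], x[1], 1.0])
    # Gradient descent with momentum on both weight vectors.
    vel_o = mom * vel_o - lr * go
    vel_h = mom * vel_h - lr * gh
    wo = wo + vel_o
    wh = wh + vel_h

e_final = total_error()
print(e0, e_final)
```

With this architecture the abstract's result implies there are no suboptimal local minima to get stuck in, so a run like the above is expected to drive the error toward zero rather than plateau at a nonzero value.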