Avoiding local minima in feedforward neural networks by simultaneous learning
AI'07: Proceedings of the 20th Australian Joint Conference on Advances in Artificial Intelligence
It was widely assumed to have been proven that two-layer feedforward neural networks with t-1 hidden nodes, when presented with t input patterns, cannot have any suboptimal local minima on the error surface. In this paper, however, we give a counterexample to this assumption. The counterexample consists of a region of local minima with nonzero error on the error surface of a neural network with three hidden nodes when presented with four patterns (the XOR problem). We also show that the original proof is valid only under an unusual definition of local minimum.
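The setting described above can be illustrated with a minimal sketch: a two-layer feedforward network with three hidden nodes trained on the four XOR patterns under a sum-of-squares error. The architecture, activation function, and training details below are illustrative assumptions for the t = 4 case, not the paper's construction of the local-minimum region itself.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# The four XOR input patterns (t = 4) and their targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Two-layer network with t - 1 = 3 hidden nodes (illustrative
# initialization; the paper's counterexample concerns a specific
# region of weight space, not a particular random start).
rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(2, 3))   # input -> hidden
b1 = np.zeros(3)
W2 = rng.normal(scale=0.5, size=(3, 1))   # hidden -> output
b2 = np.zeros(1)

def forward(X):
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    return h, out

def sse(out):
    # Sum-of-squares error over the four patterns; this is the
    # error surface on which local minima are discussed.
    return float(np.sum((out - y) ** 2))

# Plain gradient descent on the sum-of-squares error.
lr = 0.5
for _ in range(5000):
    h, out = forward(X)
    d_out = 2.0 * (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0)

_, out = forward(X)
print(sse(out))
```

Whether such a run reaches zero error depends on where it starts; the paper's point is that there exist starting regions from which gradient descent settles at a local minimum with nonzero error.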