Combinatorial optimization problems share an interesting property with spin-glass systems: their state spaces can exhibit ultrametric structure. We use sampling methods to locate points of attraction (local minima) on the error surfaces of feedforward multi-layer perceptron networks learning encoder problems. The third-order statistics of these points are examined and found to be arranged in a highly ultrametric way. This is a unique result for a finite, continuous parameter space. The implications of this result are discussed.
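The third-order statistic in question concerns triples of points: in a perfectly ultrametric space, every triangle is isosceles with its two longest sides equal. A minimal sketch of such a check over sampled parameter vectors might look as follows (the function name and the particular gap statistic are illustrative assumptions, not the paper's exact procedure):

```python
import itertools
import numpy as np

def ultrametricity_gap(points):
    """Measure how ultrametric a set of points is.

    For each triple, sort the three pairwise Euclidean distances and
    compare the two largest: in a perfectly ultrametric set they are
    equal. Returns the mean relative gap between the two largest sides
    over all triples (0.0 means perfectly ultrametric).

    NOTE: this is an illustrative statistic, not the paper's exact one.
    """
    gaps = []
    for i, j, k in itertools.combinations(range(len(points)), 3):
        d = sorted([
            np.linalg.norm(points[i] - points[j]),
            np.linalg.norm(points[j] - points[k]),
            np.linalg.norm(points[i] - points[k]),
        ])
        # d[2] >= d[1] are the two largest sides of the triangle.
        gaps.append((d[2] - d[1]) / d[2])
    return float(np.mean(gaps))
```

Applied to minima sampled from a network's weight space, a gap near zero would indicate the highly ultrametric arrangement the abstract describes; an equilateral triangle, for instance, yields a gap of exactly zero.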