The Metric Structure of Weight Space
Neural Processing Letters
Empirical Evidence for Ultrametric Structure in Multi-layer Perceptron Error Surfaces
Neural Processing Letters
Neural Network Learning Using Low-Discrepancy Sequence
AI '99 Proceedings of the 12th Australian Joint Conference on Artificial Intelligence: Advanced Topics in Artificial Intelligence
Singularities Affect Dynamics of Learning in Neuromanifolds
Neural Computation
Dynamics of learning near singularities in layered networks
Neural Computation
Functionally equivalent feedforward neural networks
Neural Computation
ISNN'05 Proceedings of the Second international conference on Advances in Neural Networks - Volume Part I
Many feedforward neural network architectures have the property that their overall input-output function is unchanged by certain weight permutations and sign flips. In this paper, the geometric structure of these equioutput weight space transformations is explored for the case of multilayer perceptron networks with tanh activation functions (similar results hold for many other types of neural networks). It is shown that these transformations form an algebraic group isomorphic to a direct product of Weyl groups. Results concerning the root spaces of the Lie algebras associated with these Weyl groups are then used to derive sets of simple equations for minimal sufficient search sets in weight space. These sets, which take the geometric forms of a wedge and a cone, occupy only a minute fraction of the volume of weight space. A separate analysis shows that large numbers of copies of a network performance function optimum weight vector are created by the action of the equioutput transformation group and that these copies all lie on the same sphere. Some implications of these results for learning are discussed.
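The equioutput symmetries described in the abstract can be illustrated concretely. The sketch below (a minimal example with hypothetical layer sizes, not the paper's construction) builds a small tanh MLP and checks two invariances: permuting hidden units together with their incoming and outgoing weights, and flipping the sign of a hidden unit's incoming weights and bias while also negating its outgoing weights, which leaves the output unchanged because tanh is odd.

```python
import numpy as np

rng = np.random.default_rng(0)

# Small tanh MLP: 3 inputs -> 4 hidden units -> 2 outputs (sizes are arbitrary).
W1 = rng.standard_normal((4, 3)); b1 = rng.standard_normal(4)
W2 = rng.standard_normal((2, 4)); b2 = rng.standard_normal(2)

def forward(W1, b1, W2, b2, x):
    """One hidden layer with tanh activation, linear output layer."""
    return W2 @ np.tanh(W1 @ x + b1) + b2

x = rng.standard_normal(3)
y = forward(W1, b1, W2, b2, x)

# Equioutput transformation 1: permute the hidden units, applying the same
# permutation to rows of W1, entries of b1, and columns of W2.
perm = np.array([2, 0, 3, 1])
W1p, b1p, W2p = W1[perm], b1[perm], W2[:, perm]
assert np.allclose(forward(W1p, b1p, W2p, b2, x), y)

# Equioutput transformation 2: sign-flip hidden unit 1. Since
# tanh(-z) = -tanh(z), negating its incoming weights and bias and
# its outgoing weights cancels out exactly.
s = np.ones(4); s[1] = -1.0
W1s, b1s, W2s = s[:, None] * W1, s * b1, W2 * s[None, :]
assert np.allclose(forward(W1s, b1s, W2s, b2, x), y)
```

For a hidden layer of n units there are n! permutations and 2^n sign patterns, so each weight vector has up to 2^n n! equioutput copies, which is why the minimal sufficient search sets described in the abstract are such a small fraction of weight space.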