A global optimum approach for one-layer neural networks
Neural Computation
In this paper, the geometric formulation of the single-layer perceptron weight optimization problem previously described by Coetzee et al. (1993, 1996) is combined with results from other researchers on nonconvex set projections to derive sufficient conditions for uniqueness of weight solutions. It is shown that the perceptron data surface is pseudoconvex and has infinite folding, allowing a region of desired vectors having unique projections to be specified purely in terms of the local curvature of the data surface. No information is therefore required regarding the global curvature or size of the data surface. In principle, these results allow for a posteriori evaluation of whether a weight solution is unique or globally optimal, and for a priori scaling of desired vector values to ensure uniqueness, through analysis of the input data. The practical applicability of these results from a numerical perspective is evaluated on some carefully chosen examples.
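The projection view underlying the abstract can be probed numerically. In the sketch below (not from the paper; the data matrix `X`, the `tanh` nonlinearity, and the desired vector `d` are illustrative assumptions), training a single-layer perceptron is treated as projecting a desired output vector onto the data surface {f(Xw) : w}, and uniqueness of the projection is checked empirically by running gradient descent from many random starting weights:

```python
# Illustrative sketch of the projection view of single-layer perceptron
# training: minimizing ||f(X w) - d||^2 projects the desired vector d
# onto the data surface {f(X w) : w}. Multiple random restarts probe
# whether the projection (weight solution) appears unique.
import numpy as np

rng = np.random.default_rng(0)

X = rng.standard_normal((20, 3))   # 20 training patterns, 3 inputs (assumed)
f = np.tanh                        # perceptron nonlinearity (assumed)
w_true = np.array([0.5, -1.0, 0.8])
d = f(X @ w_true)                  # desired vector chosen ON the surface

def loss_and_grad(w):
    z = X @ w
    r = f(z) - d                          # residual to the surface point
    g = X.T @ (r * (1.0 - f(z) ** 2))     # chain rule through tanh
    return 0.5 * np.dot(r, r), g

def project(w0, lr=0.02, steps=2000):
    """Gradient descent on the squared distance from d to the surface."""
    w = w0.copy()
    for _ in range(steps):
        _, g = loss_and_grad(w)
        w -= lr * g
    return w

# Multi-start probe: if d lies in a region of unique projections, every
# start should converge to essentially the same weight vector.
solutions = np.array([project(rng.standard_normal(3)) for _ in range(8)])
losses = [loss_and_grad(w)[0] for w in solutions]
spread = solutions.std(axis=0).max()
```

Since `d` is constructed to lie on the surface here, the closest point is the surface point itself; the paper's contribution is a curvature-based criterion that certifies such uniqueness a priori for desired vectors *near* the surface, without a multi-start search of this kind.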