An efficient hardware architecture for a neural network activation function generator
ISNN'06: Proceedings of the Third International Conference on Advances in Neural Networks - Volume Part III
Explored here is the ability of the Cell B.E. processor to efficiently reveal viable solutions for nonlinear function approximation with a multilayer perceptron (MLP) trained by the gradient descent algorithm. The capacity of the Cell B.E. to asynchronously trace several trajectories of the implemented gradient descent algorithm from a random set of starting points offers the advantage of revealing statistical trends and of classifying the viable optimal approximations delivered by the simulated function generator. Approximation conditions for second- and third-order surfaces with saddle points, such as the hyperbolic paraboloid z = x^2 - y^2 and the monkey saddle z = x^3 - 3xy^2, are determined by implementing the gradient descent algorithm (in its backpropagation version) for a three-layer MLP. Demonstrated are conditions for generating function approximations with (1) a highly irregular error distribution, (2) a close-to-uniform error distribution, and (3) enhanced approximation. In the last case the overall error is significantly smaller than the error level programmed into the algorithm to be attained via the training patterns. Such enhanced solutions offer the advantage of attaining a highly accurate function representation with minimized MLP resources (i.e., a minimized number of hidden neurons).
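The setup the abstract describes can be illustrated with a minimal sketch: a three-layer MLP (two inputs, one hidden tanh layer, one linear output) trained by plain batch gradient descent with backpropagation to approximate the hyperbolic paraboloid z = x^2 - y^2. The layer width, learning rate, iteration count, and training grid below are illustrative assumptions, not values taken from the paper; the same loop applies unchanged to the monkey saddle by swapping the target function.

```python
import numpy as np

rng = np.random.default_rng(0)

# Training patterns: a grid over [-1, 1]^2 (illustrative choice).
n = 15
xs, ys = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n))
X = np.column_stack([xs.ravel(), ys.ravel()])       # inputs, shape (N, 2)
T = (X[:, 0]**2 - X[:, 1]**2).reshape(-1, 1)        # targets z = x^2 - y^2
# For the monkey saddle, use instead:
# T = (X[:, 0]**3 - 3*X[:, 0]*X[:, 1]**2).reshape(-1, 1)

H = 16                                              # hidden neurons (assumed)
W1 = rng.normal(0.0, 0.5, (2, H)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 0.5, (H, 1)); b2 = np.zeros(1)
lr = 0.1                                            # learning rate (assumed)

for step in range(8000):
    # Forward pass.
    A = np.tanh(X @ W1 + b1)                        # hidden activations
    Y = A @ W2 + b2                                 # linear output
    E = Y - T
    # Backward pass: gradients of mean squared error.
    dY = 2.0 * E / len(X)
    dW2 = A.T @ dY; db2 = dY.sum(axis=0)
    dA = (dY @ W2.T) * (1.0 - A**2)                 # tanh'(u) = 1 - tanh(u)^2
    dW1 = X.T @ dA; db1 = dA.sum(axis=0)
    # Gradient descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - T)**2))
print(f"final MSE on training grid: {mse:.5f}")
```

Launching this loop several times from different random seeds mimics the paper's use of the Cell B.E. to trace multiple gradient-descent trajectories in parallel: comparing the final errors across seeds reveals which runs end in the irregular, near-uniform, or enhanced regimes the abstract distinguishes.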