We have devised noise-control algorithms, using biological adaptation as an analogy, for application to response optimization in adaptive systems generally. The present paper illustrates one of these algorithms by showing its effect on increasing the rate of learning in neural networks. Optimization procedures usually employ simulated annealing, in which injected noise is systematically decreased at a constant rate. Our methods are time-invariant and control the level of injected noise solely through the response of the system. Such time-invariant noise algorithms (TINA) may be more applicable than annealing to adaptive systems that must respond to unpredictable environments, and may find an analogy in brain function. Both TINA and annealing exhibit a surprising new form of generalization, in which networks trained in the presence of noise show enhanced rates of learning in a subsequent learning task when no noise is present. We use special features of the geometry of error surfaces, which depict the error as a function of changes in synaptic weights, to discuss the effect of noise in enhancing the rate of learning, and to compare the learning strategies available to networks exposed to the different training procedures. The applicability of the findings to biological systems is discussed.
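The contrast the abstract draws, a fixed annealing schedule versus noise governed solely by the system's response, can be illustrated with a minimal sketch. The code below is not the authors' algorithm; the toy objective, the controller functions, and all constants are hypothetical, chosen only to show the structural difference: the annealed controller depends implicitly on elapsed time through repeated decay, while the response-controlled one has no time index at all.

```python
import random

def noisy_descent(controller, steps=200, seed=0):
    """Minimize the toy error f(w) = w^2 by gradient descent with
    injected Gaussian noise; `controller(error, noise)` sets the
    noise level for the next step. Returns the final error."""
    rng = random.Random(seed)
    w, noise, lr = 5.0, 1.0, 0.1
    for _ in range(steps):
        grad = 2.0 * w
        # Noisy weight update: gradient plus scaled random perturbation.
        w -= lr * (grad + noise * rng.gauss(0.0, 1.0))
        noise = controller(w * w, noise)
    return w * w

def annealed(error, noise, decay=0.98):
    # Annealing-style schedule: noise shrinks at a constant rate,
    # regardless of how the system is responding.
    return noise * decay

def response_controlled(error, noise, gain=0.05):
    # Time-invariant, TINA-style sketch (hypothetical rule): the noise
    # level is a function of the current response alone, so high error
    # sustains exploration and low error suppresses it.
    return gain * error

print(noisy_descent(annealed))             # small residual error
print(noisy_descent(response_controlled))  # small residual error
```

In both runs the injected noise dies away as the network settles, but only the second controller would react if the error rose again, which is the property the abstract attributes to time-invariant noise control in unpredictable environments.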