Fuzzy Parameter Adaptation in Optimization: Some Neural Net Training Examples

  • Authors:
Payman Arabshahi, Jai J. Choi, Robert J. Marks II, Thomas P. Caudell

  • Venue:
  • IEEE Computational Science & Engineering
  • Year:
  • 1996

Abstract

Many types of artificial neural networks require training to "learn" the tasks they will be called upon to do, which amounts to properly setting the weights of the interneuron connections, or "synapses," so that the network gives the desired output. Values of certain parameters in the algorithms used to train the networks must be chosen and then adapted or optimized carefully. Error backpropagation training of the multilayer perceptron, for example, requires judicious choice of the step and momentum parameters. The ART network requires choice of a vigilance parameter, while a learning rate must be selected for Kohonen networks. These parameters are typically chosen and adapted by a human "neural smith" using heuristic rules. For example, a smooth error surface in the backpropagation training of a layered perceptron suggests use of a large step, whereas a steep surface indicates a need for smaller steps. The heuristic description of the parameter selection is fuzzy: the terms "smooth," "large," "steep," and "smaller" are fuzzy linguistic variables that can be quantified in a fuzzy inference engine. A properly designed fuzzy controller can therefore relieve the neural smith of some labor and get the job done faster and better. The concept has broader possible applications, beyond neural network training, to parameter selection and optimization generally.
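
To make the idea concrete, here is a minimal sketch of one such fuzzy controller: a simple Sugeno-style rule base that rescales a gradient-descent step size from the observed change in training error. The membership functions, rule table, and scaling factors below are illustrative assumptions for this sketch, not the controller described in the paper.

```python
# Minimal sketch of fuzzy step-size adaptation for gradient descent.
# Membership functions, rules, and scale factors are illustrative
# assumptions, not the authors' controller.

def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peak 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_step_update(delta_error, step, lo=1e-4, hi=1.0):
    """Adapt the step size from the change in training error.

    One heuristic encoding of "smooth surface -> larger step,
    steep surface -> smaller step":
      error falling smoothly -> increase step
      error roughly constant -> keep step
      error rising (overshoot) -> decrease step
    Assumes delta_error has been normalized to roughly [-1, 1].
    """
    falling = tri(delta_error, -1.0, -0.5, 0.0)   # "smooth descent"
    steady  = tri(delta_error, -0.5, 0.0, 0.5)    # "flat"
    rising  = tri(delta_error, 0.0, 0.5, 1.0)     # "overshoot"

    # Each rule votes for a multiplicative change; defuzzify with a
    # firing-strength-weighted average (Sugeno-style inference).
    w = falling + steady + rising
    if w == 0.0:  # delta_error fell outside the universe of discourse
        factor = 0.5 if delta_error > 0 else 1.1
    else:
        factor = (falling * 1.1 + steady * 1.0 + rising * 0.5) / w

    return min(max(step * factor, lo), hi)

# Usage inside a training loop, with hypothetical per-epoch errors:
step = 0.1
prev_error = None
for error in [0.90, 0.70, 0.60, 0.65, 0.50]:
    if prev_error is not None:
        step = fuzzy_step_update(error - prev_error, step)
    prev_error = error
```

An analogous rule table, driven by the same error measurements, could be assumed to govern the momentum parameter mentioned above; the point of the sketch is only that the linguistic rules a neural smith applies by hand reduce to a few membership functions and a small inference step.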