Inverse mapping of continuous functions using local and global information

  • Authors: S. Lee; R. M. Kil
  • Affiliation: Jet Propulsion Lab., California Inst. of Technol., Pasadena, CA
  • Venue: IEEE Transactions on Neural Networks
  • Year: 1994

Abstract

This paper presents a method for solving the inverse mapping of a continuous function learned by a multilayer feedforward mapping network. The method iteratively updates the input vector toward a solution while escaping from local minima. The update is determined by the pseudo-inverse of the gradient of a Lyapunov function and, when an optimal solution is sought, by the projection of the gradient of a performance index onto the null space of the gradient of the Lyapunov function. The update rule detects an input vector approaching a local minimum through a phenomenon called “update explosion”. At or near a local minimum, the input vector is either guided along an escape trajectory generated from “global information”, where global information refers to predefined or known information about the forward mapping, or relocated to a new position according to a probability density function (PDF) constructed over the input space by a Parzen estimate. The constructed PDF reflects the history of local minima detected during the search and represents the probability that a given input vector leads to a solution under the update rule. The proposed method offers a substantial advantage in computational complexity and convergence properties over conventional methods based on the Jacobian pseudo-inverse or the Jacobian transpose.
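
The abstract describes the update rule only at a high level. Below is a minimal numerical sketch of the core loop under stated assumptions, not the authors' algorithm: it takes the Lyapunov function to be E(x) = 0.5 * ||f(x) - y*||^2, steps the input along the pseudo-inverse of grad E (so that E shrinks by a fixed factor per step), flags a local minimum when the required step length explodes, and relocates via a crude Parzen-style density over recorded minima. The forward map f, all function names, and all constants are hypothetical, and the null-space projection for a secondary performance index is omitted.

```python
import numpy as np

# Hypothetical forward map standing in for the trained feedforward network.
def f(x):
    return np.array([np.sin(x[0]) + 0.5 * x[1], x[0] * x[1]])

def jacobian(fun, x, eps=1e-6):
    """Finite-difference Jacobian of fun at x (a trained network would
    supply this analytically via backpropagation)."""
    fx = fun(x)
    J = np.zeros((fx.size, x.size))
    for i in range(x.size):
        d = np.zeros_like(x)
        d[i] = eps
        J[:, i] = (fun(x + d) - fx) / eps
    return J

def parzen_restart(minima, rng, width=0.5, n_cand=64, box=4.0):
    """Crude stand-in for the paper's Parzen-estimate relocation: sample
    candidate inputs and keep the one with the lowest Gaussian-kernel
    density over the recorded local minima, i.e. the candidate least
    likely to fall back into one of them."""
    cand = rng.uniform(-box, box, size=(n_cand, minima[0].size))
    d2 = ((cand[:, None, :] - np.array(minima)[None, :, :]) ** 2).sum(-1)
    return cand[np.argmin(np.exp(-d2 / (2.0 * width**2)).sum(axis=1))]

def inverse_map(fun, y_target, x0, eta=0.5, tol=1e-9, blow_up=1e3,
                max_iter=500, seed=0):
    """Iterate x <- x - eta * E * gradE / ||gradE||^2: since the
    pseudo-inverse of the (column-vector) gradient is gradE^T/||gradE||^2,
    this is the minimum-norm step that shrinks the Lyapunov function
    E(x) = 0.5 * ||fun(x) - y_target||^2 by a factor eta per step.
    The step length diverges when gradE -> 0 while E > 0 ("update
    explosion"), which flags a local minimum and triggers a restart."""
    rng = np.random.default_rng(seed)
    x, minima = x0.astype(float).copy(), []
    for _ in range(max_iter):
        r = fun(x) - y_target
        E = 0.5 * float(r @ r)
        if E < tol:
            break
        g = jacobian(fun, x).T @ r          # gradE = J^T r
        gnorm2 = float(g @ g)
        if gnorm2 < 1e-12 or eta * E / np.sqrt(gnorm2) > blow_up:
            minima.append(x.copy())         # update explosion: local minimum
            x = parzen_restart(minima, rng)
        else:
            x = x - eta * E * g / gnorm2
    return x, minima

x_sol, minima = inverse_map(f, y_target=np.array([1.0, 0.5]),
                            x0=np.array([2.0, -2.0]))
print("x =", x_sol, " f(x) =", f(x_sol))
```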