Monotonic multi-layer perceptron networks as universal approximators

  • Author: Bernhard Lang
  • Affiliation: Siemens AG, Corporate Technology, Munich, Germany
  • Venue: ICANN'05: Proceedings of the 15th International Conference on Artificial Neural Networks: Formal Models and Their Applications - Volume Part II
  • Year: 2005

Abstract

Multi-layer perceptron networks are well-known universal approximators and widely used for system identification. In many applications, a multi-dimensional mathematical model must guarantee monotonicity with respect to one or more inputs. We introduce the MONMLP, which fulfils these monotonicity requirements through constraints on the signs of the weights of the multi-layer perceptron network. The monotonicity of the MONMLP does not depend on the quality of the training, because it is guaranteed by the network structure. Moreover, it is shown that despite its sign constraints the MONMLP remains a universal approximator. As an example of model predictive control, we present an application in the steel industry.
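
The structural idea in the abstract, guaranteeing monotonicity through the signs of the weights rather than through training, can be illustrated with a minimal sketch. The snippet below is a hypothetical one-hidden-layer forward pass, not the author's implementation; names such as monmlp_forward and monotone_idx are illustrative. Non-negative weights on every path from the selected inputs to the output, combined with a monotonically increasing activation, make the output non-decreasing in those inputs by construction.

```python
import numpy as np

def monmlp_forward(x, W1, b1, W2, b2, monotone_idx):
    """Forward pass of a one-hidden-layer MLP that is non-decreasing in the
    inputs listed in `monotone_idx`, regardless of how the raw parameters
    were trained (hypothetical sketch of a MONMLP-style constraint)."""
    W1 = W1.copy()
    # Force non-negative input-to-hidden weights for the monotone inputs.
    W1[monotone_idx, :] = np.abs(W1[monotone_idx, :])
    # Force non-negative hidden-to-output weights, so the composition with
    # the increasing activation stays non-decreasing in those inputs.
    W2_pos = np.abs(W2)
    h = np.tanh(x @ W1 + b1)          # tanh is monotonically increasing
    return h @ W2_pos + b2

# Example: 3 inputs, 5 hidden units, 1 output, required to be non-decreasing
# in inputs 0 and 2 (arbitrary illustrative values).
rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 5))
b1 = rng.normal(size=5)
W2 = rng.normal(size=(5, 1))
b2 = rng.normal(size=1)

x = np.array([[0.1, -0.4, 0.3]])
x_shift = x + np.array([[0.5, 0.0, 0.0]])   # increase a monotone input
y0 = monmlp_forward(x, W1, b1, W2, b2, monotone_idx=[0, 2])
y1 = monmlp_forward(x_shift, W1, b1, W2, b2, monotone_idx=[0, 2])
assert y1 >= y0   # guaranteed by construction, not by training quality
```

In practice, sign constraints of this kind are often handled during training by reparameterizing the constrained weights (for example through an absolute value or exponential transform), so that unconstrained gradient-based optimization can be used while the structural monotonicity guarantee is preserved; this is one common realization of the approach, not necessarily the paper's exact procedure.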