2010 Special Issue: Comparison of universal approximators incorporating partial monotonicity by structure

  • Authors:
  • Alexey Minin
  • Marina Velikova
  • Bernhard Lang
  • Hennie Daniels

  • Affiliations:
  • OOO Siemens, Monitoring and Preventive Control group, Volynskiy per. dom 3A liter A, 191186 Saint Petersburg, Russia
  • Department of Radiology, Radboud University Nijmegen Medical Centre, Nijmegen, The Netherlands
  • OOO Siemens, Monitoring and Preventive Control group, Volynskiy per. dom 3A liter A, 191186 Saint Petersburg, Russia
  • Center for Economic Research, Tilburg University, The Netherlands, and ERIM Institute of Advanced Management Studies, Erasmus University Rotterdam, Rotterdam, The Netherlands

  • Venue:
  • Neural Networks
  • Year:
  • 2010


Abstract

Neural networks applied in control loops and safety-critical domains have to meet more requirements than just the overall best function approximation. On the one hand, a small approximation error is required; on the other hand, the smoothness and the monotonicity of selected input-output relations have to be guaranteed, since otherwise the stability of most control laws is lost. In this article we compare two neural-network approaches that incorporate partial monotonicity by structure: the Monotonic Multi-Layer Perceptron (MONMLP) network and the Monotonic MIN-MAX (MONMM) network. We show the universal approximation capabilities of both types of network for partially monotone functions. On a number of datasets, we investigate the advantages and disadvantages of these approaches with respect to approximation performance, model training, and convergence.
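Both architectures named in the abstract obtain monotonicity by construction rather than by a training penalty: every path from a designated "monotone" input to the output is a composition of non-decreasing maps. The following is a minimal NumPy sketch of that idea, not the paper's implementation; the function names, the `exp()` weight parameterization, and the min-over-max grouping of the MIN-MAX network are illustrative assumptions.

```python
import numpy as np

def monmlp_forward(x, W1_raw, b1, w2_raw, b2, mono_mask):
    """Small MLP that is non-decreasing in the inputs flagged by mono_mask.

    Sketch of monotonicity by structure (assumed parameterization): weights
    fed by monotone inputs are made nonnegative via exp(), all hidden-to-
    output weights are nonnegative, and tanh is monotone, so the output is
    non-decreasing in every flagged input coordinate.
    """
    W1 = np.where(mono_mask[None, :], np.exp(W1_raw), W1_raw)  # (h, d)
    h = np.tanh(x @ W1.T + b1)                                 # (n, h)
    return h @ np.exp(w2_raw) + b2                             # (n,)

def monmm_forward(x, W_raw, b, mono_mask):
    """Min-max network: min over groups of maxima of hyperplanes.

    With nonnegative coefficients on the monotone inputs, each hyperplane
    is non-decreasing in those inputs, and min/max preserve monotonicity.
    (The paper's exact grouping may differ; this is an illustration.)
    """
    W = np.where(mono_mask[None, None, :], np.exp(W_raw), W_raw)  # (g, k, d)
    planes = np.einsum('gkd,nd->ngk', W, x) + b                   # (n, g, k)
    return planes.max(axis=2).min(axis=1)                         # (n,)
```

A quick check of the structural guarantee: sweep one flagged coordinate while holding the others fixed, and the output never decreases, regardless of the (random) raw weights.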