Comparison of Neural Networks Incorporating Partial Monotonicity by Structure

  • Authors:
  • Alexey Minin; Bernhard Lang

  • Affiliations:
  • Saint-Petersburg State University; OOO Siemens, Fault Analysis and Prevention group, Volynskiy per. dom 3A liter A, 191186 Saint-Petersburg, Russia

  • Venue:
  • ICANN '08: Proceedings of the 18th International Conference on Artificial Neural Networks, Part II
  • Year:
  • 2008

Abstract

Neural networks applied in control loops and safety-critical domains have to meet more requirements than just the best overall function approximation. On the one hand, a small approximation error is required; on the other hand, the smoothness and the monotonicity of selected input-output relations have to be guaranteed, otherwise the stability of most control laws is lost. Three approaches for partially monotonic models are compared in this article, namely the Bounded Derivative Network (BDN) [1], the Monotonic Multi-Layer Perceptron Network (MONMLP) [2], and Constrained Linear Regression (CLR). The authors investigate the advantages and disadvantages of these approaches with respect to approximation performance, model training, and convergence.
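
To illustrate the structural principle these models share, the following minimal sketch (Python/NumPy, not the authors' code) shows one common way to enforce monotonicity by construction in an MLP: the weights attached to the selected monotone inputs are kept non-negative through an exponential reparameterization, and a monotone activation is used, so the output is non-decreasing in those inputs for any parameter values. All names and parameter choices below are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of a MONMLP-style construction (assumed, not the paper's code):
# monotonicity in selected inputs is guaranteed by keeping the corresponding
# weights non-negative and using a monotone activation function.

rng = np.random.default_rng(0)

n_inputs = 3            # inputs 0 and 1 are constrained to be monotone increasing
monotone_idx = [0, 1]   # hypothetical choice of constrained inputs
n_hidden = 8

# Unconstrained raw parameters; positivity is imposed via exp() where needed.
W1_raw = rng.normal(size=(n_hidden, n_inputs))
b1 = rng.normal(size=n_hidden)
w2_raw = rng.normal(size=n_hidden)
b2 = 0.0

def forward(x):
    """Network output; non-decreasing in the monotone inputs by construction."""
    W1 = W1_raw.copy()
    W1[:, monotone_idx] = np.exp(W1_raw[:, monotone_idx])  # weights >= 0 for monotone inputs
    w2 = np.exp(w2_raw)                                    # output weights >= 0
    h = np.tanh(W1 @ x + b1)                               # monotone activation
    return float(w2 @ h + b2)

# Quick check: increasing a constrained input never decreases the output.
x = rng.normal(size=n_inputs)
x_up = x.copy()
x_up[0] += 0.5
assert forward(x_up) >= forward(x)
```

With this parameterization the derivative of the output with respect to each constrained input is a sum of non-negative terms, so the monotonicity constraint holds exactly during and after training, independent of the data.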