Monotonic multi-layer perceptron networks as universal approximators
ICANN'05 Proceedings of the 15th international conference on Artificial neural networks: formal models and their applications - Volume Part II
Neural networks applied in control loops and safety-critical domains have to meet requirements beyond the best overall function approximation. On the one hand, a small approximation error is required; on the other hand, the smoothness and monotonicity of selected input-output relations have to be guaranteed. Otherwise, the stability of most control laws is lost. This article compares three approaches to partially monotonic models: the Bounded Derivative Network (BDN) [1], the Monotonic Multi-Layer Perceptron Network (MONMLP) [2], and Constrained Linear Regression (CLR). The authors investigate the advantages and disadvantages of these approaches with respect to approximation performance, model training, and convergence.
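The core structural idea behind MONMLP-type models is that a feed-forward network whose weights along the path from a given input to the output are all non-negative, and whose activation functions are monotone, is guaranteed to be monotonically non-decreasing in that input. The following minimal NumPy sketch (an illustration of this general principle, not code from the paper) enforces positivity by squaring the free parameters and verifies the monotonicity numerically:

```python
# Hypothetical sketch of structurally enforced monotonicity (MONMLP idea):
# squaring the free parameters makes every effective weight non-negative,
# and tanh is monotone, so the network output is non-decreasing in x.
import numpy as np

rng = np.random.default_rng(0)

# Free (unconstrained) parameters; V**2 and w**2 are the effective weights.
V = rng.normal(size=(1, 8))   # input -> hidden
w = rng.normal(size=8)        # hidden -> output

def monmlp(x):
    """Scalar network, non-decreasing in x by construction."""
    h = np.tanh(x * (V ** 2))  # monotone activation, non-negative weights
    return h @ (w ** 2)        # non-negative output weights

xs = np.linspace(-3.0, 3.0, 200)[:, None]   # column of inputs
ys = monmlp(xs)                              # shape (200,)
assert np.all(np.diff(ys) >= 0)              # monotone on the sampled grid
```

Gradient-based training then operates on the unconstrained parameters `V` and `w`, so the monotonicity constraint holds at every training step without any projection or penalty term; this is the sense in which the constraint is guaranteed "by structure" rather than checked after training.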