Identifiability of parametric models.
Information-based objective functions for active data selection. Neural Computation.
Machine Learning.
Neural network exploration using optimal experiment design. Neural Networks.
Functional Adaptive Control: An Intelligent Systems Approach.
Survey paper: Optimal experimental design and some related control problems. Automatica (Journal of IFAC).
From experiment design to closed-loop control. Automatica (Journal of IFAC).
Recursive Bayesian estimation using Gaussian sums. Automatica (Journal of IFAC).
Statistical active learning in multilayer perceptrons. IEEE Transactions on Neural Networks.
Toward the training of feed-forward neural networks with the D-optimum input sequence. IEEE Transactions on Neural Networks.
Sequential Monte Carlo for Bayesian sequentially designed experiments for discrete data. Computational Statistics & Data Analysis.
The design of an optimal input signal for system identification using a multilayer perceptron network is treated. Neural networks of identical structure, differing only in parameter values, can approximate a wide range of nonlinear mappings; obtaining high-quality parameter estimates therefore hinges on a suitable input signal. It is shown that designing the input signal from the conditional probability density function of the parameters yields better results than currently used procedures based on parameter point estimates alone. Since this conditional density is unknown, it is estimated with the Gaussian sum approach, which approximates an arbitrary probability density function by a weighted sum of normal distributions. The approach is less computationally demanding than the Markov chain Monte Carlo method and outperforms the commonly used local prediction-error methods. The properties of the proposed input signal designs are illustrated in numerical examples.
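A minimal sketch of the underlying idea, under illustrative assumptions not taken from the paper: a scalar "network weight" theta with a bimodal posterior represented as a Gaussian sum, a toy model y = tanh(theta*u) + noise, and an information criterion that is either averaged over the Gaussian-sum posterior or evaluated only at the posterior-mean point estimate. The component weights, means, noise variance, and input grid are all hypothetical choices for the demonstration.

```python
import math

# Hypothetical bimodal posterior of a scalar parameter theta,
# stored as a Gaussian sum: list of (weight, mean, std dev).
# The numbers are illustrative assumptions, not from the paper.
components = [(0.5, -2.0, 0.3), (0.5, 2.5, 0.3)]

SIGMA2 = 0.01  # assumed output-noise variance of the toy model


def fisher_info(u, theta):
    """Fisher information of input u for the toy model y = tanh(theta*u) + e."""
    sech2 = 1.0 / math.cosh(theta * u) ** 2
    return (u * sech2) ** 2 / SIGMA2


def expected_info(u):
    """Fisher information averaged over the Gaussian-sum posterior,
    using a crude 7-point grid quadrature per component."""
    total = 0.0
    for w, m, s in components:
        pts = [m + s * (k - 3) for k in range(7)]
        wts = [math.exp(-((p - m) / s) ** 2 / 2) for p in pts]
        z = sum(wts)
        total += w * sum(wt * fisher_info(u, p) for wt, p in zip(wts, pts)) / z
    return total


# Point-estimate design uses only the posterior mean of theta.
theta_hat = sum(w * m for w, m, _ in components)

grid = [0.05 * k for k in range(1, 101)]  # candidate inputs in (0, 5]
u_bayes = max(grid, key=expected_info)
u_point = max(grid, key=lambda u: fisher_info(u, theta_hat))
print(u_bayes, u_point)
```

Because the two posterior modes nearly cancel in the mean, the point-estimate design picks a much larger input than the posterior-averaged design, which concentrates information where the parameter mass actually lies; this is the kind of discrepancy the density-based design is meant to avoid.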