This paper presents an analytical derivation and analysis of the uncertainty of the Multivariate State Estimation Technique (MSET). As with all nonparametric techniques, MSET uncertainty consists of two parts: bias and variance. Bias is a systematic error in the MSET inference; it is practically neither computable nor removable, but when the model is properly regularized it is usually small relative to the variance. Variance, on the other hand, represents the variability of the MSET estimate due to random noise in the data and can be estimated in real time. All derivations and results are obtained for the inferential case. The MSET cost function is also derived, showing that MSET minimizes a weighted least-squares cost function whose weighting is determined by the MSET memory matrix. Parallels are drawn between MSET and more traditional kernel techniques, namely kernel regression, and it is shown that MSET is a special type of kernel regression algorithm. The final section presents the results of the MSET uncertainty analysis for real-world data obtained from a commercial nuclear power plant.
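The kernel-regression view of MSET described above can be illustrated with a minimal sketch: the estimate is a weighted combination of the columns of a memory matrix, with weights obtained by solving a similarity system. The Gaussian similarity operator, the bandwidth `h`, and the small ridge term used here are assumptions chosen for illustration, not the paper's exact formulation.

```python
import numpy as np

def mset_estimate(D, x, h=1.0):
    """Sketch of an MSET-style estimate for a query vector x.

    D : (n_vars, m) memory matrix of stored training vectors (columns).
    x : (n_vars,) observed query vector.
    h : kernel bandwidth -- an assumed tuning parameter.

    A Gaussian similarity kernel stands in for the nonlinear
    similarity operator; MSET implementations use various operators.
    """
    def sim(A, B):
        # Pairwise Gaussian similarity between columns of A and B.
        d2 = ((A[:, :, None] - B[:, None, :]) ** 2).sum(axis=0)
        return np.exp(-d2 / (2.0 * h ** 2))

    G = sim(D, D)               # m x m similarity among memory vectors
    a = sim(D, x[:, None])      # m x 1 similarity of memory to the query
    # Small ridge term regularizes the solve, echoing the role of
    # regularization in keeping the bias component small.
    w = np.linalg.solve(G + 1e-8 * np.eye(G.shape[0]), a)
    return (D @ w).ravel()      # estimate = weighted combination of memory
```

When the query coincides with a stored memory vector, the weights concentrate on that column and the estimate reproduces it, which is the behavior one expects from a kernel-regression interpolator of this kind.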