An improved least squares support vector regression machine (LS-SVR) is proposed and applied to the non-linearity calibration of a thermocouple sensor. Solving a standard LS-SVR requires inverting an N×N matrix, where N is the number of training samples, which becomes computationally formidable as N grows. The Sherman-Morrison-Woodbury (SMW) identity is introduced into the solution of the LS-SVR, so that the inversion of the N-dimensional matrix is replaced by the inversion of an M×M matrix, where M is the dimension of the training samples. The large-sample regression problem based on LS-SVR is thereby made tractable. Finally, the non-linearity calibration of a platinum-rhodium 30/platinum-rhodium 6 (type B) thermocouple sensor is taken as an example, and the standard and improved LS-SVR are each used to identify the coefficients of a power-series calibration model. Experimental results show that the time complexity of the improved LS-SVR-based calibration method is hardly influenced by the number of calibration data points. The suggested method may also be used in other similar applications.
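The core of the speed-up can be sketched as follows. For an LS-SVR whose kernel matrix factors as K = XXᵀ (as with the explicit power-series feature map used in the calibration model), the SMW identity turns the N×N solve (I/γ + XXᵀ)a = b into an M×M solve. The function below is a minimal illustrative sketch under that linear-kernel assumption, not the paper's full algorithm; the name `smw_solve` and the regularization parameter `gamma` are assumptions for the example.

```python
import numpy as np

def smw_solve(X, gamma, b):
    """Solve (I_N / gamma + X @ X.T) a = b via the Sherman-Morrison-Woodbury
    identity, inverting only an M x M matrix (M = feature dimension).

    SMW: (I/g + X X^T)^{-1} = g I - g^2 X (I_M + g X^T X)^{-1} X^T
    """
    N, M = X.shape
    small = np.eye(M) + gamma * (X.T @ X)   # M x M system instead of N x N
    return gamma * b - gamma**2 * (X @ np.linalg.solve(small, X.T @ b))

# Usage: for N >> M the cost is dominated by forming X.T @ X, which is
# O(N * M^2), instead of the O(N^3) direct inversion.
X = np.random.randn(500, 4)      # 500 calibration points, 4 power-series terms
b = np.random.randn(500)
a = smw_solve(X, gamma=10.0, b=b)
```

This matches the abstract's claim that runtime is nearly independent of the number of calibration points: the only N×N object is never formed, and the matrix actually inverted has the (small, fixed) dimension of the feature expansion.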