A class of fast identification algorithms is introduced for Gaussian process (GP) models. The fundamental approach is to propose a new kernel function that leads to a covariance matrix with low rank, a property that is then exploited for computational efficiency in both model parameter estimation and model prediction. Two cost functions are considered: maximizing the marginal likelihood, and minimizing the Kullback-Leibler (K-L) divergence between the estimated output probability density function (pdf) and the true pdf. For each cost function, an efficient coordinate descent algorithm is proposed that estimates the kernel parameters using a one-dimensional derivative-free search and the noise variance using a fast gradient descent algorithm. Numerical examples are included to demonstrate the effectiveness of the new identification approaches.
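The abstract does not give the algorithmic details, but the computational benefit of a low-rank covariance can be illustrated with a standard identity. As a minimal sketch (not the authors' method), assume the kernel admits a finite feature expansion k(x, x') = φ(x)ᵀφ(x'), so the n×n covariance K = ΦΦᵀ has rank at most r ≪ n; the Woodbury identity then reduces the O(n³) solve in the GP predictive mean to an r×r solve. The function name and feature parameterization below are hypothetical.

```python
import numpy as np

def low_rank_gp_mean(Phi, y, phi_star, noise_var):
    """Posterior mean of a zero-mean GP whose kernel factorizes as
    k(x, x') = phi(x)^T phi(x'), so K = Phi Phi^T has rank at most
    r = Phi.shape[1].  (Hypothetical helper, for illustration only.)

    By the Woodbury identity,
        (K + s2*I_n)^{-1} y = (1/s2) * (y - Phi A^{-1} Phi^T y),
    with A = s2*I_r + Phi^T Phi, so only an r x r system is solved
    instead of the usual n x n system.
    """
    n, r = Phi.shape
    A = noise_var * np.eye(r) + Phi.T @ Phi          # r x r, cheap for r << n
    alpha = (y - Phi @ np.linalg.solve(A, Phi.T @ y)) / noise_var
    # Predictive mean k_*^T (K + s2*I)^{-1} y, with k_* = Phi phi_star.
    return phi_star @ (Phi.T @ alpha)
```

The same identity (together with the matrix determinant lemma for log |K + σ²I|) makes each marginal-likelihood evaluation inside a coordinate-descent loop scale as O(nr²) rather than O(n³), which is the kind of saving the abstract alludes to.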