Matrix Computations (3rd ed.).
A sparse representation for function approximation. Neural Computation.
Sparse Bayesian learning and the relevance vector machine. Journal of Machine Learning Research.
Regularization in the selection of radial basis function centers. Neural Computation.
Orthogonal least squares learning algorithm for radial basis function networks. IEEE Transactions on Neural Networks.
Relevance units latent variable model and nonlinear dimensionality reduction. IEEE Transactions on Neural Networks.
Pruning least objective contribution in KMSE. Neurocomputing.
Based on an analysis of Chen's orthogonal least squares (OLS) regression algorithm, this paper proposes a novel significant vector (SV) regression algorithm. The proposed regularized SV algorithm selects the significant vectors in a successive greedy process in which, in contrast to the classical OLS algorithm, the orthogonalization step has been removed. The proposed algorithm performs comparably to the OLS algorithm while avoiding the computational cost of the orthogonalization that the OLS algorithm requires.
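The greedy selection described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' exact method: it assumes a Gaussian RBF design matrix, and at each step it adds the candidate regressor that most reduces a ridge-regularized residual, solving a small linear system on the selected subset directly instead of maintaining an orthogonalized basis as classical OLS does. All function names, the kernel width, and the stopping rule are illustrative assumptions.

```python
import numpy as np

def gaussian_design(X, centers, width=1.0):
    """Gaussian RBF design matrix: Phi[i, j] = exp(-||x_i - c_j||^2 / width^2)."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / width**2)

def greedy_sv_regression(Phi, y, lam=1e-3, max_terms=10, tol=1e-6):
    """Greedy forward selection of 'significant' columns of Phi.

    Each step tries every unselected column, refits a ridge solution on the
    enlarged subset, and keeps the column giving the largest error reduction.
    No Gram-Schmidt orthogonalization of the selected columns is performed.
    Assumes at least one column improves the fit (so `selected` is non-empty).
    """
    n, m = Phi.shape
    selected = []
    prev_err = float(y @ y)  # error of the empty model
    for _ in range(max_terms):
        best_j, best_err = None, prev_err
        for j in range(m):
            if j in selected:
                continue
            cols = Phi[:, selected + [j]]
            # ridge (regularized least squares) on the trial subset
            A = cols.T @ cols + lam * np.eye(cols.shape[1])
            w = np.linalg.solve(A, cols.T @ y)
            err = float(np.sum((y - cols @ w) ** 2))
            if err < best_err:
                best_j, best_err = j, err
        # stop when no candidate yields a meaningful relative improvement
        if best_j is None or prev_err - best_err < tol * prev_err:
            break
        selected.append(best_j)
        prev_err = best_err
    # final weights for the selected significant vectors
    cols = Phi[:, selected]
    A = cols.T @ cols + lam * np.eye(len(selected))
    w = np.linalg.solve(A, cols.T @ y)
    return selected, w
```

The point of the sketch is the cost trade-off the abstract mentions: each step here costs one small regularized solve per candidate rather than the explicit orthogonalization bookkeeping of OLS, at the price of refitting the subset each time.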