Adapting Kernels by Variational Approach in SVM
AI '02 Proceedings of the 15th Australian Joint Conference on Artificial Intelligence: Advances in Artificial Intelligence
On the Noise Model of Support Vector Machine Regression
Sparse Bayesian Learning and the Relevance Vector Machine
The Journal of Machine Learning Research
Mixture of the robust L1 distributions and its applications
AI '07 Proceedings of the 20th Australian Joint Conference on Advances in Artificial Intelligence
Sparse Kernel Learning and the Relevance Units Machine
PAKDD '09 Proceedings of the 13th Pacific-Asia Conference on Advances in Knowledge Discovery and Data Mining
A new iterative procedure for solving regression problems with the LASSO penalty [1] is proposed, based on generative Bayesian modeling and inference. The algorithm produces the anticipated parsimonious, or sparse, regression models, which generalize well on unseen data. It is robust and requires no manual specification of model hyperparameters. A comparison is given with state-of-the-art methods for constructing sparse regression models: the relevance vector machine (RVM) and local regularization assisted orthogonal least squares (LROLS) regression.
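To make the optimization target concrete: the LASSO penalty referenced above corresponds to minimizing a squared-error loss plus an L1 norm on the weights, which drives many coefficients exactly to zero. The sketch below is not the Bayesian procedure proposed in the abstract; it is a minimal, standard coordinate-descent LASSO solver (with an assumed soft-thresholding update) included only to illustrate the kind of sparse solution such methods produce. Function names (`soft_threshold`, `lasso_cd`) and the penalty parameter `lam` are illustrative choices, not from the source.

```python
import numpy as np

def soft_threshold(x, t):
    # Soft-thresholding operator: the proximal map of the L1 penalty.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    # Coordinate descent for: min_w 0.5*||y - Xw||^2 + lam*||w||_1.
    # Each pass updates one coordinate in closed form via soft-thresholding.
    n, d = X.shape
    w = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0)  # per-column squared norms
    for _ in range(n_iter):
        for j in range(d):
            # Partial residual with coordinate j excluded.
            r = y - X @ w + X[:, j] * w[j]
            w[j] = soft_threshold(X[:, j] @ r, lam) / col_sq[j]
    return w

# Example: recover a sparse weight vector from noisy linear measurements.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
w_true = np.zeros(10)
w_true[:3] = [2.0, -3.0, 1.5]          # only 3 of 10 features are active
y = X @ w_true + 0.01 * rng.standard_normal(100)
w_hat = lasso_cd(X, y, lam=1.0)
```

The L1 penalty zeroes out most of the inactive coefficients, yielding the parsimonious models the abstract describes; the Bayesian approach in the paper achieves comparable sparsity without requiring a user-chosen penalty level.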