This paper proposes a variational Bayesian approach to SVM regression, based on a likelihood model given by an infinite mixture of Gaussians. The method is evaluated on synthetic datasets, where we compare it with the standard SVM algorithm as well as other well-established methods such as Gaussian process regression.
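The baseline comparison described above can be sketched as follows. This is a minimal illustration, not the paper's variational Bayesian method: it fits the two baselines (standard SVM regression and a Gaussian process) to a hypothetical synthetic dataset using scikit-learn; the kernel choices, noise level, and sinc target function are assumptions for illustration only.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Synthetic regression data: noisy samples of a sinc function
# (an assumed stand-in for the paper's synthetic datasets).
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3.0, 3.0, 80)).reshape(-1, 1)
y = np.sinc(X).ravel() + 0.1 * rng.standard_normal(80)

# Baseline 1: standard SVM regression with an RBF kernel.
svr = SVR(kernel="rbf", C=10.0, epsilon=0.05).fit(X, y)

# Baseline 2: Gaussian process regression; the WhiteKernel term
# models the observation noise.
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel()).fit(X, y)

# Compare both baselines by RMSE against the noise-free target.
X_test = np.linspace(-3.0, 3.0, 200).reshape(-1, 1)
y_true = np.sinc(X_test).ravel()
for name, model in [("SVR", svr), ("GP", gp)]:
    rmse = np.sqrt(np.mean((model.predict(X_test) - y_true) ** 2))
    print(f"{name} test RMSE: {rmse:.3f}")
```

A variational Bayesian treatment would replace the point-estimate fit with a posterior approximation over the regression function, but the two models above serve only as the comparison baselines mentioned in the abstract.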