Kernel-based machine learning techniques have been widely used to tackle problems of function approximation and regression estimation. The relevance vector machine (RVM) achieves state-of-the-art performance in sparse regression. The conventional Gaussian kernel, a popular and effective kernel function in machine learning, uses a single kernel width for all basis functions, which implicitly assumes that the response is represented below a certain frequency and the noise above it. In many cases, however, this assumption does not hold. To overcome this limitation, a novel adaptive spherical Gaussian kernel is employed for nonlinear regression, and a stagewise optimization algorithm that maximizes the Bayesian evidence within the sparse Bayesian learning framework is proposed for model selection. An extensive empirical study on two artificial datasets and two real-world benchmark datasets demonstrates the effectiveness and flexibility of the model, which represents regression problems with higher levels of sparsity and better performance than the classical RVM. An attractive property of this approach is that it automatically chooses the right kernel width locally for each relevance vector from the training dataset, maintaining the appropriate level of smoothing at each scale of the signal.