We consider the problem of missing data in kernel-based learning algorithms. We explain how semidefinite programming can be used to perform an approximate weighted completion of the kernel matrix that ensures positive semidefiniteness, and hence Mercer's condition. In numerical experiments we apply a support vector machine to the XOR classification task, based on randomly sparsified kernel matrices derived from a polynomial kernel of degree 2. The approximate completion algorithm leads to better generalisation and fewer support vectors than a simple spectral truncation method, at the cost of considerably longer runtime. We argue that semidefinite programming provides an interesting convex optimisation framework for machine learning in general and for kernel machines in particular.
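To make the setting concrete, the following is a minimal numpy sketch of approximate PSD kernel completion. It is not the interior-point SDP solver described above; instead it uses simple alternating projections between the PSD cone and the observed entries, with eigenvalue clipping serving double duty as the "spectral truncation" baseline mentioned in the abstract. The zero initialisation of missing entries, the toy XOR/polynomial-kernel setup, and the function names are illustrative assumptions.

```python
import numpy as np

def project_psd(K):
    """Project a symmetric matrix onto the PSD cone by clipping negative
    eigenvalues; used alone, this is the 'spectral truncation' baseline."""
    w, V = np.linalg.eigh(0.5 * (K + K.T))   # symmetrise, then eigendecompose
    return (V * np.clip(w, 0.0, None)) @ V.T  # rebuild with non-negative spectrum

def complete_kernel(K_obs, mask, n_iter=200):
    """Approximate PSD completion by alternating projections (a stand-in for
    the paper's SDP formulation, not its actual algorithm)."""
    K = np.where(mask, K_obs, 0.0)            # initialise missing entries at 0 (assumption)
    for _ in range(n_iter):
        K = project_psd(K)                    # enforce Mercer's condition
        K[mask] = K_obs[mask]                 # restore the observed entries
    return project_psd(K)                     # final pass guarantees a PSD output

# Toy setup echoing the experiments: a degree-2 polynomial kernel on the
# XOR inputs, randomly sparsified.
rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
K_full = (X @ X.T + 1.0) ** 2                 # polynomial kernel of degree 2
mask = rng.random(K_full.shape) < 0.6         # keep ~60% of entries
mask = mask | mask.T                          # keep the mask symmetric
np.fill_diagonal(mask, True)                  # diagonal assumed observed
K_hat = complete_kernel(K_full, mask)
```

Because the final step projects onto the PSD cone, the observed entries are only approximately preserved, which matches the "approximate weighted completion" framing: exact interpolation of a sparsified indefinite pattern and positive semidefiniteness cannot, in general, both hold.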