We present a method that uses example pairs of equal or unequal class labels to select a subspace with near-optimal metric properties in a kernel-induced Hilbert space. A representation of finite-dimensional projections as bounded linear functionals on a space of Hilbert-Schmidt operators leads to PAC-type performance guarantees for the resulting feature maps. The proposed algorithm returns the projection onto the span of the principal eigenvectors of an empirical operator constructed from the example pairs. The method can be applied in meta-learning environments, and experiments demonstrate effective transfer of knowledge between different but related learning tasks.
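The abstract describes the algorithm only at a high level: build an empirical operator from the labeled example pairs and project onto the span of its principal eigenvectors. The following NumPy sketch shows one plausible linear instantiation of that recipe. The particular operator used here (the difference between the second-moment matrices of unequal-pair and equal-pair differences) and all function and variable names are illustrative assumptions, not the paper's exact construction, which works in a kernel-induced Hilbert space.

    import numpy as np

    def select_subspace(x1, x2, same_class, d):
        """Sketch: pick a d-dimensional subspace from labeled example pairs.

        x1, x2     : (m, n) arrays holding the two members of each pair
        same_class : (m,) boolean, True if a pair shares its class label
        d          : target subspace dimension
        """
        diffs = x1 - x2
        eq, neq = diffs[same_class], diffs[~same_class]
        # Assumed empirical operator: its leading eigenvectors are directions
        # along which unequal-class pairs spread apart while equal-class
        # pairs stay close.
        T = neq.T @ neq / len(neq) - eq.T @ eq / len(eq)
        w, V = np.linalg.eigh(T)                  # eigenvalues in ascending order
        return V[:, np.argsort(w)[::-1][:d]]      # top-d principal eigenvectors

    # Usage: learn a 3-dimensional projection and map a new point into it.
    rng = np.random.default_rng(0)
    x1, x2 = rng.normal(size=(2, 200, 10))        # 200 synthetic pairs in R^10
    same = rng.random(200) < 0.5                  # synthetic pair labels
    P = select_subspace(x1, x2, same, d=3)
    z = rng.normal(size=10) @ P                   # coordinates in the subspace

A kernelized variant would apply the same eigen-decomposition to an operator expressed through the kernel matrix of the training points, in the spirit of kernel PCA.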