Guaranteed classification via regularized similarity learning
Neural Computation
A method is introduced to learn and represent similarity with linear operators in kernel-induced Hilbert spaces. Transferring error bounds for vector-valued large-margin classifiers to the setting of Hilbert-Schmidt operators yields dimension-free bounds on a risk functional for linear representations and motivates a regularized objective functional. This objective is minimized by a simple stochastic gradient descent technique. The resulting representations are tested on transfer problems in image processing, involving planar and spatial geometric invariants, handwritten characters, and face recognition.
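The idea described in the abstract can be illustrated with a minimal finite-dimensional sketch: under the simplifying assumptions of a linear kernel, a hinge loss, and Frobenius-norm regularization (the finite-dimensional analogue of a Hilbert-Schmidt norm), one can learn a matrix M so that the bilinear score x' M x'' separates similar from dissimilar pairs. All names and parameter values below are hypothetical; this is not the paper's implementation, only an instance of the regularized SGD scheme it describes.

```python
import numpy as np

def sgd_similarity(pairs, labels, d, lam=0.01, lr=0.1, epochs=20, seed=0):
    """Learn a matrix M so that s(x, xp) = x @ M @ xp is positive on
    similar pairs (label +1) and negative on dissimilar pairs (label -1).

    Minimizes a regularized hinge objective,
        sum_i max(0, 1 - y_i * x_i @ M @ xp_i) + lam * ||M||_F^2,
    by stochastic gradient descent over the labeled pairs.
    """
    rng = np.random.default_rng(seed)
    M = np.zeros((d, d))
    n = len(pairs)
    for _ in range(epochs):
        for i in rng.permutation(n):
            (x, xp), y = pairs[i], labels[i]
            grad = 2.0 * lam * M            # gradient of the regularizer
            if y * (x @ M @ xp) < 1:        # hinge loss is active
                grad -= y * np.outer(x, xp)
            M -= lr * grad
    return M
```

On a toy problem where similar pairs share a coordinate axis and dissimilar pairs do not, the learned M concentrates positive weight on the diagonal and negative weight off it, so the sign of the bilinear score classifies new pairs. The same update rule carries over to a kernel-induced feature space by replacing x and xp with feature maps.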