There are many situations in which we have more than one view of a single data source, or in which we have multiple aligned sources of data, and we would like to build classifiers that incorporate all of them to improve classification performance. Kernel Fisher Discriminant Analysis (KFDA) can be formulated as a convex optimisation problem, which we extend to the multiview setting (MFDA), and we introduce a sparse version (SMFDA). We show that our formulations are justified from both probabilistic and learning-theoretic perspectives. We then extend the optimisation problem to account for directions unique to each view (PMFDA). We give experimental validation on a toy dataset, followed by results on a brain imaging dataset and part of the PASCAL VOC 2007 challenge dataset.
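To make the starting point concrete, the following is a minimal sketch of standard single-view KFDA in its dual (regularised) form, which is the building block the abstract refers to. It is not the authors' MFDA/SMFDA/PMFDA formulation; the ridge parameter `mu`, the linear toy kernel, and the thresholding rule are illustrative assumptions.

```python
import numpy as np

def kfda(K, y, mu=1e-3):
    """Dual-form KFDA sketch: solve (N + mu*I) alpha = m_pos - m_neg,
    where N is the within-class scatter expressed via the kernel matrix.
    K: (n, n) kernel matrix; y: labels in {-1, +1}; mu: ridge regulariser."""
    n = len(y)
    idx_pos, idx_neg = (y == 1), (y == -1)
    # Dual representation of the class means: (1/n_c) * sum of kernel columns
    m_pos = K[:, idx_pos].mean(axis=1)
    m_neg = K[:, idx_neg].mean(axis=1)
    # Within-class scatter in the dual, accumulated over both classes
    N = np.zeros((n, n))
    for idx in (idx_pos, idx_neg):
        Kc = K[:, idx]
        nc = Kc.shape[1]
        N += Kc @ (np.eye(nc) - np.ones((nc, nc)) / nc) @ Kc.T
    # Regularised solve; mu keeps the system well-conditioned
    alpha = np.linalg.solve(N + mu * np.eye(n), m_pos - m_neg)
    return alpha

# Toy usage: linear kernel on two well-separated Gaussian blobs
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 1.0, (20, 2)), rng.normal(2.0, 1.0, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)
K = X @ X.T                      # linear kernel for illustration
alpha = kfda(K, y)
scores = K @ alpha               # projection onto the discriminant direction
acc = np.mean((scores > np.median(scores)) == (y == 1))
```

The multiview extensions described above replace this single kernel with one kernel per view and couple the per-view discriminant directions inside one convex programme, with an additional sparsity-inducing norm in SMFDA and per-view private directions in PMFDA.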